diff --git a/README.md b/README.md index 9467c2ce16..6184181e91 100644 --- a/README.md +++ b/README.md @@ -91,6 +91,19 @@ These examples provide more thorough mathematical treatment on a select group of - [Latent Dirichlet Allocation (LDA)](scientific_details_of_algorithms/lda_topic_modeling) dives into Amazon SageMaker's spectral decomposition approach to LDA. - [Linear Learner features](scientific_details_of_algorithms/linear_learner_class_weights_loss_functions) shows how to use the class weights and loss functions features of the SageMaker Linear Learner algorithm to improve performance on a credit card fraud prediction task +### Amazon SageMaker Debugger +These examples provide an introduction to SageMaker Debugger, which adds debugging and monitoring capabilities to the training of machine learning and deep learning algorithms. Note that although these notebooks focus on a specific framework, the same approach works with all the frameworks that Amazon SageMaker Debugger supports. The notebooks below are listed in the order in which we recommend you review them. + +- [Using a built-in rule with TensorFlow](sagemaker-debugger/tensorflow_builtin_rule/) +- [Using a custom rule with TensorFlow Keras](sagemaker-debugger/tensorflow_keras_custom_rule/) +- [Interactive tensor analysis in notebook with MXNet](sagemaker-debugger/mnist_tensor_analysis/) +- [Real-time analysis in notebook with MXNet](sagemaker-debugger/mxnet_realtime_analysis/) +- [Using a built-in rule with XGBoost](sagemaker-debugger/xgboost_builtin_rules/) +- [Real-time analysis in notebook with XGBoost](sagemaker-debugger/xgboost_realtime_analysis/) +- [Using SageMaker Debugger with Managed Spot Training and MXNet](sagemaker-debugger/mxnet_spot_training/) +- [Reacting to CloudWatch Events from Rules to take an action based on status with TensorFlow](sagemaker-debugger/tensorflow_action_on_rule/) +- [Using SageMaker Debugger with a custom PyTorch container](sagemaker-debugger/pytorch_custom_container/) + ### Advanced Amazon SageMaker Functionality These examples showcase unique functionality available in Amazon SageMaker. They cover a broad range of topics and will utilize a variety of methods, but aim to provide the user with sufficient insight or inspiration to develop within Amazon SageMaker. @@ -123,6 +136,13 @@ These examples provide you an introduction to how to use Neo to optimize deep l - [Distributed TensorFlow](sagemaker_neo_compilation_jobs/tensorflow_distributed_mnist) Adapted from [tensorflow mnist](sagemaker-python-sdk/tensorflow_distributed_mnist), including the Neo API and a comparison against the baseline - [Predicting Customer Churn](sagemaker_neo_compilation_jobs/xgboost_customer_churn) Adapted from [xgboost customer churn](introduction_to_applying_machine_learning/xgboost_customer_churn), including the Neo API and a comparison against the baseline +### Amazon SageMaker Processing + +These examples show you how to use SageMaker Processing jobs to run data processing workloads. + +- [Scikit-Learn Data Processing and Model Evaluation](sagemaker_processing/scikit_learn_data_processing_and_model_evaluation) shows how to use SageMaker Processing and the Scikit-Learn container to run data preprocessing and model evaluation workloads. +- [Feature transformation with Amazon SageMaker Processing and SparkML](sagemaker_processing/feature_transformation_with_sagemaker_processing) shows how to use SageMaker Processing to run data processing workloads using SparkML prior to training. 
+ ### Amazon SageMaker Pre-Built Framework Containers and the Python SDK #### Pre-Built Deep Learning Framework Containers diff --git a/autopilot/sagemaker_autopilot_direct_marketing.ipynb b/autopilot/sagemaker_autopilot_direct_marketing.ipynb new file mode 100644 index 0000000000..513c9513bf --- /dev/null +++ b/autopilot/sagemaker_autopilot_direct_marketing.ipynb @@ -0,0 +1,551 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Direct Marketing with Amazon SageMaker Autopilot\n", + "---\n", + "\n", + "---" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Contents\n", + "\n", + "1. [Introduction](#Introduction)\n", + "1. [Prerequisites](#Prerequisites)\n", + "1. [Downloading the dataset](#Downloading)\n", + "1. [Upload the dataset to Amazon S3](#Uploading)\n", + "1. [Setting up the SageMaker Autopilot Job](#Settingup)\n", + "1. [Launching the SageMaker Autopilot Job](#Launching)\n", + "1. [Tracking SageMaker Autopilot Job Progress](#Tracking)\n", + "1. [Results](#Results)\n", + "1. [Cleanup](#Cleanup)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Introduction\n", + "\n", + "Amazon SageMaker Autopilot is an automated machine learning (commonly referred to as AutoML) solution for tabular datasets. You can use SageMaker Autopilot in different ways: on autopilot (hence the name) or with human guidance, without code through SageMaker Studio, or using the AWS SDKs. This notebook, as a first glimpse, will use the AWS SDKs to simply create and deploy a machine learning model.\n", + "\n", + "A typical introductory task in machine learning (the \"Hello World\" equivalent) is one that uses a dataset to predict whether a customer will enroll for a term deposit at a bank, after one or more phone calls. For more information about the task and the dataset used, see [Bank Marketing Data Set](https://archive.ics.uci.edu/ml/datasets/bank+marketing).\n", + "\n", + "Direct marketing, through mail, email, phone, etc., is a common tactic to acquire customers. Because resources and a customer's attention are limited, the goal is to only target the subset of prospects who are likely to engage with a specific offer. Predicting those potential customers based on readily available information like demographics, past interactions, and environmental factors is a common machine learning problem. You can imagine that this task would readily translate to marketing lead prioritization in your own organization.\n", + "\n", + "This notebook demonstrates how you can use Autopilot on this dataset to get the most accurate ML pipeline through exploring a number of potential options, or \"candidates\". Each candidate generated by Autopilot consists of two steps. The first step performs automated feature engineering on the dataset and the second step trains and tunes an algorithm to produce a model. When you deploy this model, it follows similar steps: feature engineering followed by inference, to decide whether the lead is worth pursuing. The notebook contains instructions on how to train the model as well as how to deploy the model to perform batch predictions on a set of leads. Where possible, the Amazon SageMaker Python SDK, a high-level SDK, is used to simplify the way you interact with Amazon SageMaker.\n", + "\n", + "Other examples demonstrate how to customize models in various ways. For instance, models deployed to devices typically have memory constraints that need to be satisfied as well as accuracy. 
Other use cases have real-time deployment requirements and latency constraints. For now, keep it simple." ] }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Prerequisites\n", + "\n", + "Before you start the tasks in this tutorial, make sure you have the following:\n", + "\n", + "- An Amazon Simple Storage Service (Amazon S3) bucket and prefix to use for training and model data. This should be within the same Region as Amazon SageMaker training. The code below will create the default bucket, or use it if it already exists.\n", + "- An IAM role that gives Autopilot access to your data. See the Amazon SageMaker documentation for more information on IAM roles: https://docs.aws.amazon.com/sagemaker/latest/dg/security-iam.html" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "import sagemaker\n", + "import boto3\n", + "from sagemaker import get_execution_role\n", + "\n", + "region = boto3.Session().region_name\n", + "\n", + "session = sagemaker.Session()\n", + "bucket = session.default_bucket()\n", + "prefix = 'sagemaker/autopilot-dm'\n", + "\n", + "role = get_execution_role()\n", + "\n", + "sm = boto3.Session().client(service_name='sagemaker', region_name=region)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Downloading the dataset\n", + "Download the [direct marketing dataset](https://archive.ics.uci.edu/ml/datasets/bank+marketing) from the University of California, Irvine ML repository." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "!wget -N https://archive.ics.uci.edu/ml/machine-learning-databases/00222/bank-additional.zip\n", + "!unzip -o bank-additional.zip\n", + "\n", + "local_data_path = './bank-additional/bank-additional-full.csv'\n" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "toc-hr-collapsed": true + }, + "source": [ + "## Upload the dataset to Amazon S3\n", + "\n", + "Before you run Autopilot on the dataset, first perform a check of the dataset to make sure that it has no obvious errors. The Autopilot process can take a long time, and it's generally a good practice to inspect the dataset before you start a job. This particular dataset is small, so you can inspect it in the notebook instance itself. If you have a larger dataset that will not fit in the notebook instance's memory, inspect the dataset offline using a big data analytics tool like Apache Spark. [Deequ](https://github.com/awslabs/deequ) is a library built on top of Apache Spark that can be helpful for performing checks on large datasets. Autopilot is capable of handling datasets up to 5 GB.\n" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Read the data into a Pandas data frame and take a look." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "import pandas as pd\n", + "\n", + "data = pd.read_csv(local_data_path, sep=';')\n", + "pd.set_option('display.max_columns', 500) # Make sure we can see all of the columns\n", + "pd.set_option('display.max_rows', 10) # Keep the output on one page\n", + "data" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Note that there are 20 features to help predict the target column 'y'.\n", + "\n", + "Amazon SageMaker Autopilot takes care of preprocessing your data for you. 
You do not need to perform conventional data preprocessing techniques such as handling missing values, converting categorical features to numeric features, scaling data, and handling more complicated data types.\n", + "\n", + "Moreover, splitting the dataset into training and validation splits is not necessary. Autopilot takes care of this for you. You may, however, want to split out a test set. That comes next, although it is used for batch inference at the end rather than for testing the model.\n" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Reserve some data for calling batch inference on the model\n", + "\n", + "Divide the data into training and testing splits. The training split is used by SageMaker Autopilot. The testing split is reserved to perform inference using the suggested model.\n" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "train_data = data.sample(frac=0.8, random_state=200)\n", + "\n", + "test_data = data.drop(train_data.index)\n", + "\n", + "test_data_no_target = test_data.drop(columns=['y'])" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Upload the dataset to Amazon S3\n", + "Copy the file to Amazon Simple Storage Service (Amazon S3) in a .csv format for Amazon SageMaker training to use." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "train_file = 'train_data.csv'\n", + "train_data.to_csv(train_file, index=False, header=True)\n", + "train_data_s3_path = session.upload_data(path=train_file, key_prefix=prefix + \"/train\")\n", + "print('Train data uploaded to: ' + train_data_s3_path)\n", + "\n", + "test_file = 'test_data.csv'\n", + "test_data_no_target.to_csv(test_file, index=False, header=False)\n", + "test_data_s3_path = session.upload_data(path=test_file, key_prefix=prefix + \"/test\")\n", + "print('Test data uploaded to: ' + test_data_s3_path)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Setting up the SageMaker Autopilot Job\n", + "\n", + "After uploading the dataset to Amazon S3, you can invoke Autopilot to find the best ML pipeline to train a model on this dataset. \n", + "\n", + "The required inputs for invoking an Autopilot job are:\n", + "* Amazon S3 location for input dataset and for all output artifacts\n", + "* Name of the column of the dataset you want to predict (`y` in this case) \n", + "* An IAM role\n", + "\n", + "Currently Autopilot supports only tabular datasets in CSV format. Either all files should have a header row, or the first file of the dataset, when sorted in alphabetical/lexical order, is expected to have a header row." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "input_data_config = [{\n", + " 'DataSource': {\n", + " 'S3DataSource': {\n", + " 'S3DataType': 'S3Prefix',\n", + " 'S3Uri': 's3://{}/{}/train'.format(bucket,prefix)\n", + " }\n", + " },\n", + " 'TargetAttributeName': 'y'\n", + " }\n", + " ]\n", + "\n", + "output_data_config = {\n", + " 'S3OutputPath': 's3://{}/{}/output'.format(bucket,prefix)\n", + " }" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "You can also specify the type of problem you want to solve with your dataset (`Regression, MulticlassClassification, BinaryClassification`). 
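As a sketch, the following optional `create_auto_ml_job` arguments would pin the problem type and cap the job size; the parameter names follow the boto3 API, and the values shown are illustrative only, so verify both against your SDK version:\n",
+ "\n",
+ "```python\n",
+ "# Illustrative values only; these optional arguments could be added to the\n",
+ "# create_auto_ml_job call in the Launching section below.\n",
+ "problem_type = 'BinaryClassification'\n",
+ "job_objective = {'MetricName': 'F1'}  # metric for Autopilot to optimize\n",
+ "job_config = {'CompletionCriteria': {'MaxCandidates': 50}}  # cap pipeline evaluations\n",
+ "```\n",
+ "\n",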
+ "If you are not sure, SageMaker Autopilot will infer the problem type based on statistics of the target column (the column you want to predict). \n", + "\n", + "You have the option to limit the running time of a SageMaker Autopilot job by providing either the maximum number of pipeline evaluations or candidates (one pipeline evaluation is called a `Candidate` because it generates a candidate model) or the total time allocated for the overall Autopilot job. Under default settings, this job takes about four hours to run. This varies between runs because of the nature of the exploratory process Autopilot uses to find optimal training parameters." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Launching the SageMaker Autopilot Job\n", + "\n", + "You can now launch the Autopilot job by calling the `create_auto_ml_job` API. " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "from time import gmtime, strftime, sleep\n", + "timestamp_suffix = strftime('%d-%H-%M-%S', gmtime())\n", + "\n", + "auto_ml_job_name = 'automl-banking-' + timestamp_suffix\n", + "print('AutoMLJobName: ' + auto_ml_job_name)\n", + "\n", + "sm.create_auto_ml_job(AutoMLJobName=auto_ml_job_name,\n", + " InputDataConfig=input_data_config,\n", + " OutputDataConfig=output_data_config,\n", + " RoleArn=role)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Tracking SageMaker Autopilot job progress\n", + "A SageMaker Autopilot job consists of the following high-level steps: \n", + "* Analyzing Data, where the dataset is analyzed and Autopilot comes up with a list of ML pipelines that should be tried out on the dataset. The dataset is also split into train and validation sets.\n", + "* Feature Engineering, where Autopilot performs feature transformation on individual features of the dataset as well as at an aggregate level.\n", + "* Model Tuning, where the top performing pipeline is selected along with the optimal hyperparameters for the training algorithm (the last stage of the pipeline). " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "print('JobStatus - Secondary Status')\n", + "print('------------------------------')\n", + "\n", + "\n", + "describe_response = sm.describe_auto_ml_job(AutoMLJobName=auto_ml_job_name)\n", + "print(describe_response['AutoMLJobStatus'] + \" - \" + describe_response['AutoMLJobSecondaryStatus'])\n", + "job_run_status = describe_response['AutoMLJobStatus']\n", + " \n", + "while job_run_status not in ('Failed', 'Completed', 'Stopped'):\n", + " describe_response = sm.describe_auto_ml_job(AutoMLJobName=auto_ml_job_name)\n", + " job_run_status = describe_response['AutoMLJobStatus']\n", + " \n", + " print(describe_response['AutoMLJobStatus'] + \" - \" + describe_response['AutoMLJobSecondaryStatus'])\n", + " sleep(30)" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "toc-hr-collapsed": true + }, + "source": [ + "## Results\n", + "\n", + "Now use the describe_auto_ml_job API to look up the best candidate selected by the SageMaker Autopilot job. 
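If the job stopped or failed before producing a candidate, the response will not contain a `BestCandidate` key; a minimal guard, sketched using the `sm` client and `auto_ml_job_name` defined above:\n",
+ "\n",
+ "```python\n",
+ "# Sketch: fail fast if no candidate is available yet.\n",
+ "resp = sm.describe_auto_ml_job(AutoMLJobName=auto_ml_job_name)\n",
+ "if 'BestCandidate' not in resp:\n",
+ "    raise RuntimeError('No candidate yet; job status: ' + resp['AutoMLJobStatus'])\n",
+ "```"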
" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "best_candidate = sm.describe_auto_ml_job(AutoMLJobName=auto_ml_job_name)['BestCandidate']\n", + "best_candidate_name = best_candidate['CandidateName']\n", + "print(best_candidate)\n", + "print('\\n')\n", + "print(\"CandidateName: \" + best_candidate_name)\n", + "print(\"FinalAutoMLJobObjectiveMetricName: \" + best_candidate['FinalAutoMLJobObjectiveMetric']['MetricName'])\n", + "print(\"FinalAutoMLJobObjectiveMetricValue: \" + str(best_candidate['FinalAutoMLJobObjectiveMetric']['Value']))" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "toc-hr-collapsed": false + }, + "source": [ + "### Perform batch inference using the best candidate\n", + "\n", + "Now that you have successfully completed the SageMaker Autopilot job on the dataset, create a model from any of the candidates by using [Inference Pipelines](https://docs.aws.amazon.com/sagemaker/latest/dg/inference-pipelines.html). " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "model_name = 'automl-banking-model-' + timestamp_suffix\n", + "\n", + "model = sm.create_model(Containers=best_candidate['InferenceContainers'],\n", + " ModelName=model_name,\n", + " ExecutionRoleArn=role)\n", + "\n", + "print('Model ARN corresponding to the best candidate is : {}'.format(model['ModelArn']))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "You can use batch inference by using Amazon SageMaker batch transform. The same model can also be deployed to perform online inference using Amazon SageMaker hosting." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "scrolled": true + }, + "outputs": [], + "source": [ + "transform_job_name = 'automl-banking-transform-' + timestamp_suffix\n", + "\n", + "transform_input = {\n", + " 'DataSource': {\n", + " 'S3DataSource': {\n", + " 'S3DataType': 'S3Prefix',\n", + " 'S3Uri': test_data_s3_path\n", + " }\n", + " },\n", + " 'ContentType': 'text/csv',\n", + " 'CompressionType': 'None',\n", + " 'SplitType': 'Line'\n", + " }\n", + "\n", + "transform_output = {\n", + " 'S3OutputPath': 's3://{}/{}/inference-results'.format(bucket,prefix),\n", + " }\n", + "\n", + "transform_resources = {\n", + " 'InstanceType': 'ml.m5.4xlarge',\n", + " 'InstanceCount': 1\n", + " }\n", + "\n", + "sm.create_transform_job(TransformJobName = transform_job_name,\n", + " ModelName = model_name,\n", + " TransformInput = transform_input,\n", + " TransformOutput = transform_output,\n", + " TransformResources = transform_resources\n", + ")" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Watch the transform job for completion." 
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "print('JobStatus')\n", + "print('----------')\n", + "\n", + "\n", + "describe_response = sm.describe_transform_job(TransformJobName = transform_job_name)\n", + "job_run_status = describe_response['TransformJobStatus']\n", + "print(job_run_status)\n", + "\n", + "while job_run_status not in ('Failed', 'Completed', 'Stopped'):\n", + " describe_response = sm.describe_transform_job(TransformJobName = transform_job_name)\n", + " job_run_status = describe_response['TransformJobStatus']\n", + " print(job_run_status)\n", + " sleep(30)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Now let's view the results of the transform job:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "s3_output_key = '{}/inference-results/test_data.csv.out'.format(prefix)\n", + "local_inference_results_path = 'inference_results.csv'\n", + "\n", + "s3 = boto3.resource('s3')\n", + "inference_results_bucket = s3.Bucket(session.default_bucket())\n", + "\n", + "inference_results_bucket.download_file(s3_output_key, local_inference_results_path)\n", + "\n", + "data = pd.read_csv(local_inference_results_path, sep=';')\n", + "pd.set_option('display.max_rows', 10) # Keep the output on one page\n", + "data" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### View other candidates explored by SageMaker Autopilot\n", + "You can view all the candidates (pipeline evaluations with different hyperparameter combinations) that were explored by SageMaker Autopilot and sort them by their final performance metric." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "candidates = sm.list_candidates_for_auto_ml_job(AutoMLJobName=auto_ml_job_name, SortBy='FinalObjectiveMetricValue')['Candidates']\n", + "index = 1\n", + "for candidate in candidates:\n", + " print(str(index) + \" \" + candidate['CandidateName'] + \" \" + str(candidate['FinalAutoMLJobObjectiveMetric']['Value']))\n", + " index += 1" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Candidate Generation Notebook\n", + " \n", + "SageMaker Autopilot also auto-generates a Candidate Definitions notebook. This notebook can be used to interactively step through the various steps taken by SageMaker Autopilot to arrive at the best candidate. 
This notebook can also be used to override various runtime parameters like parallelism, hardware used, algorithms explored, feature extraction scripts, and more.\n", + " \n", + "The notebook can be downloaded from the following Amazon S3 location:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "sm.describe_auto_ml_job(AutoMLJobName=auto_ml_job_name)['AutoMLJobArtifacts']['CandidateDefinitionNotebookLocation']\n" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Data Exploration Notebook\n", + "SageMaker Autopilot also auto-generates a Data Exploration notebook, which can be downloaded from the following Amazon S3 location:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "sm.describe_auto_ml_job(AutoMLJobName=auto_ml_job_name)['AutoMLJobArtifacts']['DataExplorationNotebookLocation']\n" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Cleanup\n", + "\n", + "The Autopilot job creates many underlying artifacts such as dataset splits, preprocessing scripts, and preprocessed data. This code, when uncommented, deletes them. This operation deletes all the generated models and the auto-generated notebooks as well. " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "#s3 = boto3.resource('s3')\n", + "#bucket = s3.Bucket(bucket)\n", + "\n", + "#job_outputs_prefix = '{}/output/{}'.format(prefix,auto_ml_job_name)\n", + "#bucket.objects.filter(Prefix=job_outputs_prefix).delete()" + ] + } + ], + "metadata": { + "kernelspec": { + "display_name": "conda_amazonei_mxnet_p27", + "language": "python", + "name": "conda_amazonei_mxnet_p27" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 2 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython2", + "version": "2.7.15" + } + }, + "nbformat": 4, + "nbformat_minor": 4 +} diff --git a/aws_sagemaker_studio/sagemaker_experiments/mnist-handwritten-digits-classification-experiment.ipynb b/aws_sagemaker_studio/sagemaker_experiments/mnist-handwritten-digits-classification-experiment.ipynb new file mode 100644 index 0000000000..10c73dad6b --- /dev/null +++ b/aws_sagemaker_studio/sagemaker_experiments/mnist-handwritten-digits-classification-experiment.ipynb @@ -0,0 +1,467 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## MNIST Handwritten Digits Classification Experiment\n", + "\n", + "This demo shows how you can use the SageMaker Experiment Management Python SDK to organize, track, compare, and evaluate your machine learning (ML) model training experiments.\n", + "\n", + "You can track artifacts for experiments, including data sets, algorithms, hyperparameters, and metrics. Experiments executed on SageMaker, such as SageMaker Autopilot jobs and training jobs, are automatically tracked. You can also track artifacts for additional steps within an ML workflow that come before or after model training, e.g. data pre-processing or post-training model evaluation.\n", + "\n", + "The APIs also let you search and browse your current and past experiments, compare experiments, and identify the best performing models.\n", + "\n", + "Now we will demonstrate these capabilities through an MNIST handwritten digits classification example. 
The experiment will be organized as follows:\n", + "\n", + "1. Download and prepare the MNIST dataset.\n", + "2. Train a Convolutional Neural Network (CNN) Model. Tune the hyperparameter that configures the number of hidden channels in the model. Track the parameter configurations and resulting model accuracy using the SageMaker Experiments Python SDK.\n", + "3. Finally use the search and analytics capabilities of the Python SDK to search, compare and evaluate the performance of all model versions generated from model tuning in Step 2.\n", + "4. We will also see an example of tracing the complete lineage of a model version, i.e. the collection of all the data pre-processing and training configurations and inputs that went into creating that model version.\n", + "\n", + "Make sure you select the `Python 3 (Data Science)` kernel." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Install Python SDKs" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "import sys" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "!{sys.executable} -m pip install sagemaker-experiments" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Install PyTorch" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "!{sys.executable} -m pip install torch\n", + "!{sys.executable} -m pip install torchvision" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Setup" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "import time\n", + "\n", + "import boto3\n", + "import numpy as np\n", + "import pandas as pd\n", + "%config InlineBackend.figure_format = 'retina'\n", + "from matplotlib import pyplot as plt\n", + "from torchvision import datasets, transforms\n", + "\n", + "import sagemaker\n", + "from sagemaker import get_execution_role\n", + "from sagemaker.session import Session\n", + "from sagemaker.analytics import ExperimentAnalytics\n", + "\n", + "from smexperiments.experiment import Experiment\n", + "from smexperiments.trial import Trial\n", + "from smexperiments.trial_component import TrialComponent\n", + "from smexperiments.tracker import Tracker" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "sess = boto3.Session()\n", + "sm = sess.client('sagemaker')\n", + "role = get_execution_role()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Create an S3 bucket to hold data" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "# create an S3 bucket to hold data; note that your account might have already created a bucket with this name\n", + "account_id = sess.client('sts').get_caller_identity()[\"Account\"]\n", + "bucket = 'sagemaker-experiments-{}-{}'.format(sess.region_name, account_id)\n", + "prefix = 'mnist'\n", + "\n", + "try:\n", + " if sess.region_name == \"us-east-1\":\n", + " sess.client('s3').create_bucket(Bucket=bucket)\n", + " else:\n", + " sess.client('s3').create_bucket(Bucket=bucket, \n", + " CreateBucketConfiguration={'LocationConstraint': sess.region_name})\n", + "except Exception as e:\n", + " print(e)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Dataset\n", + "We download the MNIST handwritten 
digits dataset and then apply a transformation to each image." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "# download the dataset\n", + "# this will not only download data to ./mnist folder, but also load and transform (normalize) them\n", + "train_set = datasets.MNIST('mnist', train=True, transform=transforms.Compose([\n", + " transforms.ToTensor(),\n", + " transforms.Normalize((0.1307,), (0.3081,))]), \n", + " download=True)\n", + " \n", + "test_set = datasets.MNIST('mnist', train=False, transform=transforms.Compose([\n", + " transforms.ToTensor(),\n", + " transforms.Normalize((0.1307,), (0.3081,))]),\n", + " download=False)" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "plt.imshow(train_set.data[2].numpy())" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "After transforming the images in the dataset, we upload them to S3." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "inputs = sagemaker.Session().upload_data(path='mnist', bucket=bucket, key_prefix=prefix)\n", + "print('input spec: {}'.format(inputs))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Now let's track the parameters from the data pre-processing step." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "with Tracker.create(display_name=\"Preprocessing\", sagemaker_boto_client=sm) as tracker:\n", + " tracker.log_parameters({\n", + " \"normalization_mean\": 0.1307,\n", + " \"normalization_std\": 0.3081,\n", + " })\n", + " # we can log the s3 uri to the dataset we just uploaded\n", + " tracker.log_input(name=\"mnist-dataset\", media_type=\"s3/uri\", value=inputs)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Step 1 - Set up the Experiment\n", + "\n", + "Create an experiment to track all the model training iterations. Experiments are a great way to organize your data science work. You can create experiments to organize all your model development work for: [1] a business use case you are addressing (e.g. create an experiment named \u201ccustomer churn prediction\u201d), or [2] a data science team that owns the experiment (e.g. create an experiment named \u201cmarketing analytics experiment\u201d), or [3] a specific data science and ML project. Think of it as a \u201cfolder\u201d for organizing your \u201cfiles\u201d." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Create an Experiment" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "mnist_experiment = Experiment.create(\n", + " experiment_name=f\"mnist-hand-written-digits-classification-{int(time.time())}\", \n", + " description=\"Classification of mnist hand-written digits\", \n", + " sagemaker_boto_client=sm)\n", + "print(mnist_experiment)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Step 2 - Track Experiment\n", + "### Now create a Trial for each training run to track its inputs, parameters, and metrics.\n", + "While training the CNN model on SageMaker, we will experiment with several values for the number of hidden channels in the model. We will create a Trial to track each training job run. We will also create a TrialComponent from the tracker we created before, and add it to the Trial. 
This will enrich the Trial with the parameters we captured from the data pre-processing stage.\n", + "\n", + "Note that the execution of the following code takes a while." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "from sagemaker.pytorch import PyTorch" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "hidden_channel_trial_name_map = {}" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "If you want to run the following training jobs asynchronously, you may need to increase your resource limit. Otherwise, you can run them sequentially." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "preprocessing_trial_component = tracker.trial_component" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "for i, num_hidden_channel in enumerate([2, 5, 10, 20, 32]):\n", + " # create trial\n", + " trial_name = f\"cnn-training-job-{num_hidden_channel}-hidden-channels-{int(time.time())}\"\n", + " cnn_trial = Trial.create(\n", + " trial_name=trial_name, \n", + " experiment_name=mnist_experiment.experiment_name,\n", + " sagemaker_boto_client=sm,\n", + " )\n", + " hidden_channel_trial_name_map[num_hidden_channel] = trial_name\n", + " \n", + " # associate the preprocessing trial component with the current trial\n", + " cnn_trial.add_trial_component(preprocessing_trial_component)\n", + " \n", + " # all input configurations, parameters, and metrics specified in estimator \n", + " # definition are automatically tracked\n", + " estimator = PyTorch(\n", + " entry_point='./mnist.py',\n", + " role=role,\n", + " sagemaker_session=sagemaker.Session(sagemaker_client=sm),\n", + " framework_version='1.1.0',\n", + " train_instance_count=1,\n", + " train_instance_type='ml.c4.xlarge',\n", + " hyperparameters={\n", + " 'epochs': 2,\n", + " 'backend': 'gloo',\n", + " 'hidden_channels': num_hidden_channel,\n", + " 'dropout': 0.2,\n", + " 'optimizer': 'sgd'\n", + " },\n", + " metric_definitions=[\n", + " {'Name':'train:loss', 'Regex':'Train Loss: (.*?);'},\n", + " {'Name':'test:loss', 'Regex':'Test Average loss: (.*?),'},\n", + " {'Name':'test:accuracy', 'Regex':'Test Accuracy: (.*?)%;'}\n", + " ],\n", + " enable_sagemaker_metrics=True,\n", + " )\n", + " \n", + " cnn_training_job_name = \"cnn-training-job-{}\".format(int(time.time()))\n", + " \n", + " # Now associate the estimator with the Experiment and Trial\n", + " estimator.fit(\n", + " inputs={'training': inputs}, \n", + " job_name=cnn_training_job_name,\n", + " experiment_config={\n", + " \"TrialName\": cnn_trial.trial_name,\n", + " \"TrialComponentDisplayName\": \"Training\",\n", + " },\n", + " wait=True,\n", + " )\n", + " \n", + " # give it a while before dispatching the next training job\n", + " time.sleep(2)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Compare the model training runs for an experiment\n", + "\n", + "Now we will use the analytics capabilities of the Python SDK to query and compare the training runs for identifying the best model produced by our experiment. You can retrieve trial components by using a search expression.\n",
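+ "\n",
+ "Once the analytics dataframe below has been built (rows sorted by maximum test accuracy, descending), the top row corresponds to the best run. A minimal sketch of pulling it out; the metric column names, e.g. `test:accuracy - Max`, are an assumption and may vary by SDK version:\n",
+ "\n",
+ "```python\n",
+ "df = trial_component_analytics.dataframe()\n",
+ "best_run = df.iloc[0]  # first row = highest max test accuracy\n",
+ "print(best_run['TrialComponentName'], best_run['test:accuracy - Max'])\n",
+ "```"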
+ ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Some Simple Analyses" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "search_expression = {\n", + " \"Filters\":[\n", + " {\n", + " \"Name\": \"DisplayName\",\n", + " \"Operator\": \"Equals\",\n", + " \"Value\": \"Training\",\n", + " }\n", + " ],\n", + "}" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "trial_component_analytics = ExperimentAnalytics(\n", + " sagemaker_session=Session(sess, sm), \n", + " experiment_name=mnist_experiment.experiment_name,\n", + " search_expression=search_expression,\n", + " sort_by=\"metrics.test:accuracy.max\",\n", + " sort_order=\"Descending\",\n", + " metric_names=['test:accuracy'],\n", + " parameter_names=['hidden_channels', 'epochs', 'dropout', 'optimizer']\n", + ")" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "trial_component_analytics.dataframe()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "To isolate and measure the impact of the number of hidden channels on model accuracy, we vary the number of hidden channels and fix the values of the other hyperparameters.\n", + "\n", + "Next let's look at an example of tracing the lineage of a model by accessing the data tracked by SageMaker Experiments for the `cnn-training-job-2-hidden-channels` trial" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "lineage_table = ExperimentAnalytics(\n", + " sagemaker_session=Session(sess, sm), \n", + " search_expression={\n", + " \"Filters\":[{\n", + " \"Name\": \"Parents.TrialName\",\n", + " \"Operator\": \"Equals\",\n", + " \"Value\": hidden_channel_trial_name_map[2]\n", + " }]\n", + " },\n", + " sort_by=\"CreationTime\",\n", + " sort_order=\"Ascending\",\n", + ")" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "lineage_table.dataframe()" + ] + } + ], + "metadata": { + "kernelspec": { + "display_name": "Python 3 (Data Science)", + "language": "python", + "name": "python3__SAGEMAKER_INTERNAL__arn:aws:sagemaker:us-east-2:429704687514:environment/datascience" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.7.3" + } + }, + "nbformat": 4, + "nbformat_minor": 4 +} diff --git a/aws_sagemaker_studio/sagemaker_experiments/mnist.py b/aws_sagemaker_studio/sagemaker_experiments/mnist.py new file mode 100644 index 0000000000..ee598a8eb8 --- /dev/null +++ b/aws_sagemaker_studio/sagemaker_experiments/mnist.py @@ -0,0 +1,222 @@ +import argparse +import json +import logging +import os +from os.path import join +import sagemaker_containers +import sys +import torch +import torch.distributed as dist +import torch.nn as nn +import torch.nn.functional as F +import torch.optim as optim +import torch.utils.data +import torch.utils.data.distributed +from torchvision import datasets, transforms + +import boto3 + +import time + +logger = logging.getLogger(__name__) +logger.setLevel(logging.DEBUG) +logger.addHandler(logging.StreamHandler(sys.stdout)) + +if 'SAGEMAKER_METRICS_DIRECTORY' in os.environ: + log_file_handler = 
logging.FileHandler(join(os.environ['SAGEMAKER_METRICS_DIRECTORY'], "metrics.json")) + # setFormatter expects a logging.Formatter object, not a bare string + log_file_handler.setFormatter(logging.Formatter( + "{'time':'%(asctime)s', 'name': '%(name)s', \ + 'level': '%(levelname)s', 'message': '%(message)s'}" + )) + logger.addHandler(log_file_handler) + +# Based on https://github.com/pytorch/examples/blob/master/mnist/main.py +class Net(nn.Module): + def __init__(self, hidden_channels=10, kernel_size=5, drop_out=.5): + super(Net, self).__init__() + self.conv1 = nn.Conv2d(1, hidden_channels, kernel_size=kernel_size) + self.conv2 = nn.Conv2d(hidden_channels, 20, kernel_size=kernel_size) + self.conv2_drop = nn.Dropout2d(p=drop_out) + self.fc1 = nn.Linear(320, 50) + self.fc2 = nn.Linear(50, 10) + + def forward(self, x): + x = F.relu(F.max_pool2d(self.conv1(x), 2)) + x = F.relu(F.max_pool2d(self.conv2_drop(self.conv2(x)), 2)) + x = x.view(-1, 320) + x = F.relu(self.fc1(x)) + x = F.dropout(x, training=self.training) + x = self.fc2(x) + return F.log_softmax(x, dim=1) + + +def _get_train_data_loader(batch_size, training_dir, is_distributed, **kwargs): + logger.info("Get train data loader") + dataset = datasets.MNIST(training_dir, train=True, transform=transforms.Compose([ + transforms.ToTensor(), + transforms.Normalize((0.1307,), (0.3081,)) + ]), download=False) + train_sampler = torch.utils.data.distributed.DistributedSampler(dataset) if is_distributed else None + return torch.utils.data.DataLoader(dataset, batch_size=batch_size, shuffle=train_sampler is None, + sampler=train_sampler, **kwargs) + + +def _get_test_data_loader(test_batch_size, training_dir, **kwargs): + logger.info("Get test data loader") + return torch.utils.data.DataLoader( + datasets.MNIST(training_dir, train=False, transform=transforms.Compose([ + transforms.ToTensor(), + transforms.Normalize((0.1307,), (0.3081,)) + ]), download=False), + batch_size=test_batch_size, shuffle=True, **kwargs) + + +def _average_gradients(model): + # Gradient averaging. + size = float(dist.get_world_size()) + for param in model.parameters(): + dist.all_reduce(param.grad.data, op=dist.reduce_op.SUM) + param.grad.data /= size + + +def train(args, tracker=None): + is_distributed = len(args.hosts) > 1 and args.backend is not None + logger.debug("Distributed training - {}".format(is_distributed)) + use_cuda = args.num_gpus > 0 + logger.debug("Number of gpus available - {}".format(args.num_gpus)) + kwargs = {'num_workers': 1, 'pin_memory': True} if use_cuda else {} + device = torch.device("cuda" if use_cuda else "cpu") + + if is_distributed: + # Initialize the distributed environment. + world_size = len(args.hosts) + os.environ['WORLD_SIZE'] = str(world_size) + host_rank = args.hosts.index(args.current_host) + os.environ['RANK'] = str(host_rank) + dist.init_process_group(backend=args.backend, rank=host_rank, world_size=world_size) + logger.info('Initialized the distributed environment: \'{}\' backend on {} nodes. '.format( + args.backend, dist.get_world_size()) + 'Current host rank is {}. Number of gpus: {}'.format( + dist.get_rank(), args.num_gpus)) + + # set the seed for generating random numbers + torch.manual_seed(args.seed) + if use_cuda: + torch.cuda.manual_seed(args.seed) + + train_loader = _get_train_data_loader(args.batch_size, args.data_dir, is_distributed, **kwargs) + test_loader = _get_test_data_loader(args.test_batch_size, args.data_dir, **kwargs) + + logger.info("Processes {}/{} ({:.0f}%) of train data".format( + len(train_loader.sampler), len(train_loader.dataset), + 100. 
* len(train_loader.sampler) / len(train_loader.dataset) + )) + + logger.info("Processes {}/{} ({:.0f}%) of test data".format( + len(test_loader.sampler), len(test_loader.dataset), + 100. * len(test_loader.sampler) / len(test_loader.dataset) + )) + + model = Net(args.hidden_channels, args.kernel_size, args.dropout).to(device) + if is_distributed and use_cuda: + # multi-machine multi-gpu case + model = torch.nn.parallel.DistributedDataParallel(model) + else: + # single-machine multi-gpu case or single-machine or multi-machine cpu case + model = torch.nn.DataParallel(model) + + if args.optimizer == 'sgd': + optimizer = optim.SGD(model.parameters(), lr=args.lr, momentum=args.momentum) + else: + optimizer = optim.Adam(model.parameters(), lr=args.lr) + + for epoch in range(1, args.epochs + 1): + model.train() + for batch_idx, (data, target) in enumerate(train_loader, 1): + data, target = data.to(device), target.to(device) + optimizer.zero_grad() + output = model(data) + loss = F.nll_loss(output, target) + loss.backward() + if is_distributed and not use_cuda: + # average gradients manually for multi-machine cpu case only + _average_gradients(model) + optimizer.step() + if batch_idx % args.log_interval == 0: + logger.info('Train Epoch: {} [{}/{} ({:.0f}%)], Train Loss: {:.6f};'.format( + epoch, batch_idx * len(data), len(train_loader.sampler), + 100. * batch_idx / len(train_loader), loss.item())) + test(model, test_loader, device, tracker) + save_model(model, args.model_dir) + + +def test(model, test_loader, device, tracker=None): + model.eval() + test_loss = 0 + correct = 0 + with torch.no_grad(): + for data, target in test_loader: + data, target = data.to(device), target.to(device) + output = model(data) + test_loss += F.nll_loss(output, target, size_average=False).item() # sum up batch loss + pred = output.max(1, keepdim=True)[1] # get the index of the max log-probability + correct += pred.eq(target.view_as(pred)).sum().item() + + test_loss /= len(test_loader.dataset) + logger.info('Test Average loss: {:.4f}, Test Accuracy: {:.0f}%;\n'.format( + test_loss, 100. 
* correct / len(test_loader.dataset))) + + +def model_fn(model_dir): + device = torch.device("cuda" if torch.cuda.is_available() else "cpu") + model = torch.nn.DataParallel(Net()) + with open(os.path.join(model_dir, 'model.pth'), 'rb') as f: + model.load_state_dict(torch.load(f)) + return model.to(device) + +def save_model(model, model_dir): + logger.info("Saving the model.") + path = os.path.join(model_dir, 'model.pth') + # recommended way from http://pytorch.org/docs/master/notes/serialization.html + torch.save(model.cpu().state_dict(), path) + + +if __name__ == '__main__': + parser = argparse.ArgumentParser() + + # Data and model checkpoints directories + parser.add_argument('--batch-size', type=int, default=64, metavar='N', + help='input batch size for training (default: 64)') + parser.add_argument('--test-batch-size', type=int, default=1000, metavar='N', + help='input batch size for testing (default: 1000)') + parser.add_argument('--epochs', type=int, default=10, metavar='N', + help='number of epochs to train (default: 10)') + parser.add_argument('--optimizer', type=str, default="sgd", + help='optimizer for training.') + parser.add_argument('--lr', type=float, default=0.01, metavar='LR', + help='learning rate (default: 0.01)') + parser.add_argument('--dropout', type=float, default=0.5, metavar='DROP', + help='dropout rate (default: 0.5)') + parser.add_argument('--kernel_size', type=int, default=5, metavar='KERNEL', + help='conv2d filter kernel size (default: 5)') + parser.add_argument('--momentum', type=float, default=0.5, metavar='M', + help='SGD momentum (default: 0.5)') + parser.add_argument('--hidden_channels', type=int, default=10, + help='number of channels in hidden conv layer') + parser.add_argument('--seed', type=int, default=1, metavar='S', + help='random seed (default: 1)') + parser.add_argument('--log-interval', type=int, default=100, metavar='N', + help='how many batches to wait before logging training status') + parser.add_argument('--backend', type=str, default=None, + help='backend for distributed training (tcp, gloo on cpu and gloo, nccl on gpu)') + + + # Container environment + parser.add_argument('--hosts', type=list, default=json.loads(os.environ['SM_HOSTS'])) + parser.add_argument('--current-host', type=str, default=os.environ['SM_CURRENT_HOST']) + parser.add_argument('--model-dir', type=str, default=os.environ['SM_MODEL_DIR']) + parser.add_argument('--data-dir', type=str, default=os.environ['SM_CHANNEL_TRAINING']) + parser.add_argument('--num-gpus', type=int, default=os.environ['SM_NUM_GPUS']) + + args = parser.parse_args() + + train(args) \ No newline at end of file diff --git a/sagemaker-debugger/README.md b/sagemaker-debugger/README.md new file mode 100644 index 0000000000..1d34a581ff --- /dev/null +++ b/sagemaker-debugger/README.md @@ -0,0 +1,13 @@ +## Amazon SageMaker Debugger Examples + +These examples provide an introduction to SageMaker Debugger, which adds debugging and monitoring capabilities to the training of machine learning and deep learning algorithms. Note that although these notebooks focus on a specific framework, the same approach works with all the frameworks that Amazon SageMaker Debugger supports. The notebooks below are listed in the order in which we recommend you review them. 
+ +- [Using a built-in rule with TensorFlow](tensorflow_builtin_rule/) +- [Using a custom rule with TensorFlow Keras](tensorflow_keras_custom_rule/) +- [Interactive tensor analysis in notebook with MXNet](mnist_tensor_analysis/) +- [Real-time analysis in notebook with MXNet](mxnet_realtime_analysis/) +- [Using a built-in rule with XGBoost](xgboost_builtin_rules/) +- [Real-time analysis in notebook with XGBoost](xgboost_realtime_analysis/) +- [Using SageMaker Debugger with Managed Spot Training and MXNet](mxnet_spot_training/) +- [Reacting to CloudWatch Events from Rules to take an action based on status with TensorFlow](tensorflow_action_on_rule/) +- [Using SageMaker Debugger with a custom PyTorch container](pytorch_custom_container/) diff --git a/sagemaker-debugger/mnist_tensor_analysis/images/example.gif b/sagemaker-debugger/mnist_tensor_analysis/images/example.gif new file mode 100644 index 0000000000..e6c6a71201 Binary files /dev/null and b/sagemaker-debugger/mnist_tensor_analysis/images/example.gif differ diff --git a/sagemaker-debugger/mnist_tensor_analysis/mnist_tensor_analysis.ipynb b/sagemaker-debugger/mnist_tensor_analysis/mnist_tensor_analysis.ipynb new file mode 100644 index 0000000000..eeacce2cd0 --- /dev/null +++ b/sagemaker-debugger/mnist_tensor_analysis/mnist_tensor_analysis.ipynb @@ -0,0 +1,1071 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Tensor analysis using Amazon SageMaker Debugger\n", + "\n", + "Looking at the distributions of activation inputs/outputs, gradients and weights per layer can give useful insights. For instance, it helps to understand whether the model runs into problems like neuron saturation, whether there are layers in your model that are not learning at all, or whether the network consists of too many layers. \n", + "\n", + "The following animation shows the distribution of gradients of a convolutional layer from an example application as the training progresses. We can see that it starts as a Gaussian distribution but then becomes more and more narrow. We can also see that the range of gradients starts very small (order of $1e-5$) and becomes even tinier as training progresses. If tiny gradients are observed from the start of training, it is an indication that we should check the hyperparameters of our model. \n", + "\n", + "![](images/example.gif)\n", + "\n", + "In this notebook we will train a poorly configured neural network and use Amazon SageMaker Debugger with custom rules to aggregate and analyze specific tensors. Before we proceed, let us install the smdebug library, which allows us to perform interactive analysis in this notebook. After installing it, please restart the kernel, and when you come back skip this cell.\n", + "\n", + "### Installing smdebug" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "! python -m pip install smdebug" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Configuring the inputs for the training job\n", + "\n", + "Now we'll call the SageMaker MXNet Estimator to kick off a training job. The `entry_point_script` points to the MXNet training script. Users can create a custom *SessionHook* in their training script. If they choose not to create such a hook in the training script (as in the script we use in this example), Amazon SageMaker Debugger will create the appropriate *SessionHook* based on the specified *DebugHookConfig* parameters.
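 A sketch of what a script-created hook could look like follows; the class and method names come from the smdebug MXNet module and should be verified against the smdebug version in your container:\n",
+ "\n",
+ "```python\n",
+ "# Sketch only: a hook created inside the training script instead of via\n",
+ "# DebugHookConfig. Verify these smdebug names for your smdebug version.\n",
+ "import smdebug.mxnet as smd\n",
+ "\n",
+ "hook = smd.Hook(out_dir='/opt/ml/tensors',\n",
+ "                save_config=smd.SaveConfig(save_interval=100),\n",
+ "                include_collections=['weights', 'gradients'])\n",
+ "hook.register_block(net)  # 'net' stands for the Gluon block being trained\n",
+ "```\n",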
+ "\n", + "The `hyperparameters` are the parameters that will be passed to the training script. We choose `Uniform(1)` as the initializer and a learning rate of `0.001`. This leads to the model not training well because it is poorly initialized.\n", + "\n", + "The goal of a good initialization is \n", + "- to break the symmetry so that parameters do not receive the same gradients and updates\n", + "- to keep the variance similar across layers\n", + "\n", + "A bad initialization may lead to vanishing or exploding gradients and the model not training at all. Once the training is finished we will look at the distributions of activation inputs/outputs, gradients and weights across the training to see how these hyperparameters influenced the training.\n" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "jupyter": { + "outputs_hidden": true + } + }, + "outputs": [], + "source": [ + "entry_point_script = 'mnist.py'\n", + "bad_hyperparameters = {'initializer': 2, 'lr': 0.001}" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "jupyter": { + "outputs_hidden": true + } + }, + "outputs": [], + "source": [ + "import sagemaker\n", + "from sagemaker.mxnet import MXNet\n", + "from sagemaker.debugger import DebuggerHookConfig, CollectionConfig\n", + "import boto3\n", + "import os\n", + "\n", + "sagemaker_session = sagemaker.Session()\n", + "BUCKET_NAME = sagemaker_session.default_bucket()\n", + "LOCATION_IN_BUCKET = 'smdebug-mnist-tensor-analysis'\n", + "\n", + "s3_bucket_for_tensors = 's3://{BUCKET_NAME}/{LOCATION_IN_BUCKET}'.format(BUCKET_NAME=BUCKET_NAME, LOCATION_IN_BUCKET=LOCATION_IN_BUCKET)\n", + "estimator = MXNet(role=sagemaker.get_execution_role(),\n", + " base_job_name='mxnet',\n", + " train_instance_count=1,\n", + " train_instance_type='ml.m5.xlarge',\n", + " train_volume_size=400,\n", + " source_dir='src',\n", + " entry_point=entry_point_script,\n", + " hyperparameters=bad_hyperparameters,\n", + " framework_version='1.6.0',\n", + " py_version='py3',\n", + " debugger_hook_config = DebuggerHookConfig(\n", + " s3_output_path=s3_bucket_for_tensors, \n", + " collection_configs=[\n", + " CollectionConfig(\n", + " name=\"all\",\n", + " parameters={\n", + " \"include_regex\": \".*\",\n", + " \"save_interval\": \"100\"\n", + " }\n", + " )\n", + " ]\n", + " )\n", + " )" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Start the training job" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "jupyter": { + "outputs_hidden": true + } + }, + "outputs": [], + "source": [ + "estimator.fit(wait=False)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Get S3 location of tensors\n", + "\n", + "We can get information related to the training job:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "jupyter": { + "outputs_hidden": true + } + }, + "outputs": [], + "source": [ + "job_name = estimator.latest_training_job.name\n", + "\n", + "client = estimator.sagemaker_session.sagemaker_client\n", + "\n", + "description = client.describe_training_job(TrainingJobName=job_name)\n", + "\n", + "print('downloading tensors from training job: ', job_name)" + ] + }, + { + "cell_type": "markdown", + "metadata": {},
"source": [ + "We can retrieve the S3 location of the tensors by accessing the dictionary `description`:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "jupyter": { + "outputs_hidden": true + } + }, + "outputs": [], + "source": [ + "path = description['DebugHookConfig']['S3OutputPath'] + '/' + job_name + '/debug-output'\n", + "\n", + "print('Tensors are stored in: ', path)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Download tensors from S3\n", + "\n", + "Now we will download the tensors from S3, so that we can visualize them in our notebook." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "jupyter": { + "outputs_hidden": true + } + }, + "outputs": [], + "source": [ + "folder_name = \" /tmp/{}\".format(path.split(\"/\")[-1])\n", + "os.system(\"aws s3 cp --recursive {} {}\".format(path,folder_name))\n", + "print('Downloading tensors into folder: ', folder_name)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Now that we have obtained the tensors from our training job, it is time to plot the distribution of different layers. \n", + "In the following sections we will use Amazon SageMaker Debugger and custom rules to retrieve certain tensors. Typically, rules are supposed to return True or False. However in this notebook we will use custom rules to return dictionaries of aggregated tensors per layer and step, which we then plot afterwards.\n", + "\n", + "### Activation outputs\n", + "This rule will use Amazon SageMaker Debugger to retrieve tensors from the ReLU output layers. It sums the activations across batch and steps. If there is a large fraction of ReLUs outputing 0 across many steps it means that the neuron is dying." 
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "jupyter": { + "outputs_hidden": true + } + }, + "outputs": [], + "source": [ + "from smdebug.trials import create_trial\n", + "from smdebug.rules.rule_invoker import invoke_rule\n", + "from smdebug.exceptions import NoMoreData\n", + "from smdebug.rules.rule import Rule\n", + "import numpy as np\n", + "import utils\n", + "import collections\n", + "import os\n", + "from IPython.display import Image" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "jupyter": { + "outputs_hidden": true + } + }, + "outputs": [], + "source": [ + "class ActivationOutputs(Rule):\n", + " def __init__(self, base_trial):\n", + " super().__init__(base_trial) \n", + " self.tensors = collections.OrderedDict() \n", + " \n", + " def invoke_at_step(self, step):\n", + " for tname in self.base_trial.tensor_names(regex='.*relu_output'):\n", + " if \"gradients\" not in tname:\n", + " try:\n", + " tensor = self.base_trial.tensor(tname).value(step)\n", + " if tname not in self.tensors:\n", + " self.tensors[tname] = collections.OrderedDict()\n", + " if step not in self.tensors[tname]:\n", + " self.tensors[tname][step] = 0\n", + " neg_values = np.where(tensor <= 0)[0]\n", + " if len(neg_values) > 0:\n", + " self.logger.info(f\" Step {step} tensor {tname} has {len(neg_values)/tensor.size*100}% activation outputs which are 0 \")\n", + " batch_over_sum = np.sum(tensor, axis=0)/tensor.shape[0]\n", + " self.tensors[tname][step] += batch_over_sum\n", + " except:\n", + " self.logger.warning(f\"Can not fetch tensor {tname}\")\n", + " return False\n", + "\n", + "trial = create_trial(folder_name)\n", + "rule = ActivationOutputs(trial)\n", + "try:\n", + " invoke_rule(rule)\n", + "except NoMoreData:\n", + " print('The training has ended and there is no more data to be analyzed. This is expected behavior.')\n" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Plot the histograms" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "jupyter": { + "outputs_hidden": true + } + }, + "outputs": [], + "source": [ + "utils.create_interactive_matplotlib_histogram(rule.tensors, filename='images/activation_outputs.gif')" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "Image(url='images/activation_outputs.gif')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Activation Inputs\n", + "In this rule we look at the inputs into the activation function, rather than the outputs. This can be helpful for understanding whether there are extreme negative or positive values that saturate the activation functions. 
" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "jupyter": { + "outputs_hidden": true + } + }, + "outputs": [], + "source": [ + "class ActivationInputs(Rule):\n", + " def __init__(self, base_trial):\n", + " super().__init__(base_trial) \n", + " self.tensors = collections.OrderedDict() \n", + " \n", + " def invoke_at_step(self, step):\n", + " for tname in self.base_trial.tensor_names(regex='.*relu_input'):\n", + " if \"gradients\" not in tname:\n", + " try:\n", + " tensor = self.base_trial.tensor(tname).value(step)\n", + " if tname not in self.tensors:\n", + " self.tensors[tname] = {}\n", + " if step not in self.tensors[tname]:\n", + " self.tensors[tname][step] = 0\n", + " neg_values = np.where(tensor <= 0)[0]\n", + " if len(neg_values) > 0:\n", + " self.logger.info(f\" Tensor {tname} has {len(neg_values)/tensor.size*100}% activation inputs which are smaller than 0 \")\n", + " batch_over_sum = np.sum(tensor, axis=0)/tensor.shape[0]\n", + " self.tensors[tname][step] += batch_over_sum\n", + " except:\n", + " self.logger.warning(f\"Can not fetch tensor {tname}\")\n", + " return False\n", + "\n", + "trial = create_trial(folder_name)\n", + "rule = ActivationInputs(trial)\n", + "try:\n", + " invoke_rule(rule)\n", + "except NoMoreData:\n", + " print('The training has ended and there is no more data to be analyzed. This is expected behavior.')\n" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Plot the histograms" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "jupyter": { + "outputs_hidden": true + } + }, + "outputs": [], + "source": [ + "utils.create_interactive_matplotlib_histogram(rule.tensors, filename='images/activation_inputs.gif')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We can see that second convolutional layer `conv1_relu_input_0` receives only negative input values, which means that all ReLUs in this layer output 0." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "Image(url='images/activation_inputs.gif')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Gradients\n", + "The following code retrieves the gradients and plots their distribution. If variance is tiny, that means that the model parameters do not get updated effectively with each training step or that the training has converged to a minimum." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "jupyter": { + "outputs_hidden": true + } + }, + "outputs": [], + "source": [ + "class GradientsLayer(Rule):\n", + " def __init__(self, base_trial):\n", + " super().__init__(base_trial) \n", + " self.tensors = collections.OrderedDict() \n", + " \n", + " def invoke_at_step(self, step):\n", + " for tname in self.base_trial.tensor_names(regex='.*gradient'):\n", + " try:\n", + " tensor = self.base_trial.tensor(tname).value(step)\n", + " if tname not in self.tensors:\n", + " self.tensors[tname] = {}\n", + "\n", + " self.logger.info(f\" Tensor {tname} has gradients range: {np.min(tensor)} {np.max(tensor)} \")\n", + " self.tensors[tname][step] = tensor\n", + " except:\n", + " self.logger.warning(f\"Can not fetch tensor {tname}\")\n", + " return False\n", + "\n", + "trial = create_trial(folder_name)\n", + "rule = GradientsLayer(trial)\n", + "try:\n", + " invoke_rule(rule)\n", + "except NoMoreData:\n", + " print('The training has ended and there is no more data to be analyzed. 
This is expected behavior.')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Plot the histograms" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "jupyter": { + "outputs_hidden": true + } + }, + "outputs": [], + "source": [ + "utils.create_interactive_matplotlib_histogram(rule.tensors, filename='images/gradients.gif')" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "Image(url='images/gradients.gif')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Check variance across layers\n", + "The rule retrieves gradients, but this time we compare variance of gradient distribution across layers. We want to identify if there is a large difference between the min and max variance per training step. For instance, very deep neural networks may suffer from vanishing gradients the deeper we go. By checking this ratio we can determine if we run into such a situation." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "jupyter": { + "outputs_hidden": true + } + }, + "outputs": [], + "source": [ + "class GradientsAcrossLayers(Rule):\n", + " def __init__(self, base_trial, ):\n", + " super().__init__(base_trial) \n", + " self.tensors = collections.OrderedDict() \n", + " \n", + " def invoke_at_step(self, step):\n", + " for tname in self.base_trial.tensor_names(regex='.*gradient'):\n", + " try:\n", + " tensor = self.base_trial.tensor(tname).value(step)\n", + " if step not in self.tensors:\n", + " self.tensors[step] = [np.inf, 0]\n", + " variance = np.var(tensor.flatten())\n", + " if variance < self.tensors[step][0]:\n", + " self.tensors[step][0] = variance\n", + " elif variance > self.tensors[step][1]:\n", + " self.tensors[step][1] = variance \n", + " self.logger.info(f\" Step {step} current ratio: {self.tensors[step][0]} {self.tensors[step][1]} Ratio: {self.tensors[step][1] / self.tensors[step][0]}\") \n", + " except:\n", + " self.logger.warning(f\"Can not fetch tensor {tname}\")\n", + " return False\n", + "\n", + "trial = create_trial(folder_name)\n", + "rule = GradientsAcrossLayers(trial)\n", + "try:\n", + " invoke_rule(rule)\n", + "except NoMoreData:\n", + " print('The training has ended and there is no more data to be analyzed. This is expected behavior.')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Let's check min and max values of the gradients across layers:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "jupyter": { + "outputs_hidden": true + } + }, + "outputs": [], + "source": [ + "for step in rule.tensors:\n", + " print(\"Step\", step, \"variance of gradients: \", rule.tensors[step][0], \" to \", rule.tensors[step][1])" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Distribution of weights\n", + "This rule retrieves the weight tensors and checks the variance. If the distribution does not change much across steps it may indicate that the learning rate is too low, that gradients are too small or that the training has converged to a minimum." 
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "jupyter": { + "outputs_hidden": true + } + }, + "outputs": [], + "source": [ + "class WeightRatio(Rule):\n", + " def __init__(self, base_trial, ):\n", + " super().__init__(base_trial) \n", + " self.tensors = collections.OrderedDict() \n", + " \n", + " def invoke_at_step(self, step):\n", + " for tname in self.base_trial.tensor_names(regex='.*weight'):\n", + " if \"gradient\" not in tname:\n", + " try:\n", + " tensor = self.base_trial.tensor(tname).value(step)\n", + " if tname not in self.tensors:\n", + " self.tensors[tname] = {}\n", + " \n", + " self.logger.info(f\" Tensor {tname} has weights with variance: {np.var(tensor.flatten())} \")\n", + " self.tensors[tname][step] = tensor\n", + " except:\n", + " self.logger.warning(f\"Can not fetch tensor {tname}\")\n", + " return False\n", + "\n", + "trial = create_trial(folder_name)\n", + "rule = WeightRatio(trial)\n", + "try:\n", + " invoke_rule(rule)\n", + "except NoMoreData:\n", + " print('The training has ended and there is no more data to be analyzed. This is expected behavior.')\n" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Plot the histograms" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "jupyter": { + "outputs_hidden": true + } + }, + "outputs": [], + "source": [ + "utils.create_interactive_matplotlib_histogram(rule.tensors, filename='images/weights.gif')" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "Image(url='images/weights.gif')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Inputs\n", + "\n", + "This rule retrieves layer inputs excluding activation inputs." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "jupyter": { + "outputs_hidden": true + } + }, + "outputs": [], + "source": [ + "class Inputs(Rule):\n", + " def __init__(self, base_trial, ):\n", + " super().__init__(base_trial) \n", + " self.tensors = collections.OrderedDict() \n", + " \n", + " def invoke_at_step(self, step):\n", + " for tname in self.base_trial.tensor_names(regex='.*input'):\n", + " if \"relu\" not in tname:\n", + " try:\n", + " tensor = self.base_trial.tensor(tname).value(step)\n", + " if tname not in self.tensors:\n", + " self.tensors[tname] = {}\n", + " \n", + " self.logger.info(f\" Tensor {tname} has inputs with variance: {np.var(tensor.flatten())} \")\n", + " self.tensors[tname][step] = tensor\n", + " except:\n", + " self.logger.warning(f\"Can not fetch tensor {tname}\")\n", + " return False\n", + "\n", + "trial = create_trial(folder_name)\n", + "rule = Inputs(trial)\n", + "try:\n", + " invoke_rule(rule)\n", + "except NoMoreData:\n", + " print('The training has ended and there is no more data to be analyzed. This is expected behavior.')\n" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Plot the histograms" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "utils.create_interactive_matplotlib_histogram(rule.tensors, filename='images/layer_inputs.gif')" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "Image(url='images/layer_inputs.gif')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Layer outputs\n", + "This rule retrieves outputs of layers excluding activation outputs." 
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "jupyter": { + "outputs_hidden": true + } + }, + "outputs": [], + "source": [ + "class Outputs(Rule):\n", + " def __init__(self, base_trial, ):\n", + " super().__init__(base_trial) \n", + " self.tensors = collections.OrderedDict() \n", + " \n", + " def invoke_at_step(self, step):\n", + " for tname in self.base_trial.tensor_names(regex='.*output'):\n", + " if \"relu\" not in tname:\n", + " try:\n", + " tensor = self.base_trial.tensor(tname).value(step)\n", + " if tname not in self.tensors:\n", + " self.tensors[tname] = {}\n", + " \n", + " self.logger.info(f\" Tensor {tname} has outputs with variance: {np.var(tensor.flatten())} \")\n", + " self.tensors[tname][step] = tensor\n", + " except:\n", + " self.logger.warning(f\"Can not fetch tensor {tname}\")\n", + " return False\n", + "\n", + "trial = create_trial(folder_name)\n", + "rule = Outputs(trial)\n", + "try:\n", + " invoke_rule(rule)\n", + "except NoMoreData:\n", + " print('The training has ended and there is no more data to be analyzed. This is expected behavior.')\n" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Plot the histograms" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "jupyter": { + "outputs_hidden": true + } + }, + "outputs": [], + "source": [ + "utils.create_interactive_matplotlib_histogram(rule.tensors, filename='images/layer_outputs.gif')" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "jupyter": { + "outputs_hidden": true + } + }, + "outputs": [], + "source": [ + "Image(url='images/layer_outputs.gif')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Comparison \n", + "In the previous sections we looked at the distributions of gradients, activation outputs, and weights of a model that did not train well due to poor initialization. Now we will compare some of these distributions with those of a well-initialized model." 
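To make the comparison concrete, here is a rough, self-contained numpy sketch (not the notebook's MXNet model, whose initializers live in src/mnist.py) of how activation variance evolves through a few dense+ReLU layers under a Xavier-style initialization versus `Uniform(1)`:

```python
# Rough numpy sketch of variance propagation under two initializations.
# Illustrative only; the actual model and initializers are in src/mnist.py.
import numpy as np

rng = np.random.default_rng(0)
fan_in = 100
x = rng.normal(size=(256, fan_in))

def xavier():
    limit = np.sqrt(6.0 / (fan_in + fan_in))
    return rng.uniform(-limit, limit, (fan_in, fan_in))

def uniform_one():
    return rng.uniform(-1.0, 1.0, (fan_in, fan_in))

for name, sampler in [('Xavier', xavier), ('Uniform(1)', uniform_one)]:
    h = x
    for _ in range(5):                    # five dense + ReLU layers
        h = np.maximum(h @ sampler(), 0)
    print(name, 'activation variance after 5 layers:', np.var(h))
# Uniform(1) blows the variance up exponentially with depth,
# while Xavier keeps it roughly controlled.
```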
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "entry_point_script = 'mnist.py'\n", + "hyperparameters = {'lr': 0.01}" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "estimator = MXNet(role=sagemaker.get_execution_role(),\n", + " base_job_name='mxnet',\n", + " train_instance_count=1,\n", + " train_instance_type='ml.m5.xlarge',\n", + " train_volume_size=400,\n", + " source_dir='src',\n", + " entry_point=entry_point_script,\n", + " hyperparameters=hyperparameters,\n", + " framework_version='1.6.0',\n", + " py_version='py3',\n", + " debugger_hook_config = DebuggerHookConfig(\n", + " s3_output_path=s3_bucket_for_tensors, \n", + " collection_configs=[\n", + " CollectionConfig(\n", + " name=\"all\",\n", + " parameters={\n", + " \"include_regex\": \".*\",\n", + " \"save_interval\": \"100\"\n", + " }\n", + " )\n", + " ]\n", + " )\n", + " )\n", + " " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Start the training job" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "estimator.fit(wait=False)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Get the S3 path where the tensors have been stored" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "job_name = estimator.latest_training_job.name\n", + "client = estimator.sagemaker_session.sagemaker_client\n", + "description = client.describe_training_job(TrainingJobName=job_name)\n", + "path = description['DebugHookConfig']['S3OutputPath'] + '/' + job_name + '/debug-output'\n", + "print('Tensors are stored in: ', path)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Download tensors from S3" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "folder_name2 = \"/tmp/{}_2\".format(path.split(\"/\")[-1])\n", + "os.system(\"aws s3 cp --recursive {} {}\".format(path,folder_name2))\n", + "print('Downloading tensors into folder: ', folder_name2)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### Gradients\n", + "\n", + "Let's compare the distribution of gradients of the convolutional layers of both trials." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "trial = create_trial(folder_name)\n", + "rule = GradientsLayer(trial)\n", + "try:\n", + " invoke_rule(rule)\n", + "except NoMoreData:\n", + " print('The training has ended and there is no more data to be analyzed. This is expected behavior.')\n" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "dict_gradients = {}\n", + "dict_gradients['gradient/conv0_weight_bad_hyperparameters'] = rule.tensors['gradient/conv0_weight']\n", + "dict_gradients['gradient/conv1_weight_bad_hyperparameters'] = rule.tensors['gradient/conv1_weight']" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Second trial:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "trial = create_trial(folder_name2)\n", + "rule = GradientsLayer(trial)\n", + "try:\n", + " invoke_rule(rule)\n", + "except NoMoreData:\n", + " print('The training has ended and there is no more data to be analyzed. 
This is expected behavior.')\n" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "dict_gradients['gradient/conv0_weight_good_hyperparameters'] = rule.tensors['gradient/conv0_weight']\n", + "dict_gradients['gradient/conv1_weight_good_hyperparameters'] = rule.tensors['gradient/conv1_weight']" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Plot the histograms" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "utils.create_interactive_matplotlib_histogram(dict_gradients, filename='images/gradients_comparison.gif')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "In the case of the poorly initalized model, gradients are fluctuating a lot leading to very high variance. " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "Image(url='images/gradients_comparison.gif')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### Activation inputs\n", + "\n", + "Lets compare distribution of activation inputs of both trials." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "trial = create_trial(folder_name)\n", + "rule = ActivationInputs(trial)\n", + "try:\n", + " invoke_rule(rule)\n", + "except NoMoreData:\n", + " print('The training has ended and there is no more data to be analyzed. This is expected behavior.')\n" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "dict_activation_inputs = {}\n", + "dict_activation_inputs['conv0_relu_input_0_bad_hyperparameters'] = rule.tensors['conv0_relu_input_0']\n", + "dict_activation_inputs['conv1_relu_input_0_bad_hyperparameters'] = rule.tensors['conv1_relu_input_0']" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Second trial" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "trial = create_trial(folder_name2)\n", + "rule = ActivationInputs(trial)\n", + "try:\n", + " invoke_rule(rule)\n", + "except NoMoreData:\n", + " print('The training has ended and there is no more data to be analyzed. This is expected behavior.')\n" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "dict_activation_inputs['conv0_relu_input_0_good_hyperparameters'] = rule.tensors['conv0_relu_input_0']\n", + "dict_activation_inputs['conv1_relu_input_0_good_hyperparameters'] = rule.tensors['conv1_relu_input_0']" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Plot the histograms" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "utils.create_interactive_matplotlib_histogram(dict_activation_inputs, filename='images/activation_inputs_comparison.gif')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The distribution of activation inputs into first activation layer `conv0_relu_input_0` look quite similar in both trials. However in the case of the second layer they drastically differ. 
" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "Image(url='images/activation_inputs_comparison.gif')" + ] + } + ], + "metadata": { + "kernelspec": { + "display_name": "Python 3", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.7.4" + } + }, + "nbformat": 4, + "nbformat_minor": 4 +} diff --git a/sagemaker-debugger/mnist_tensor_analysis/src/mnist.py b/sagemaker-debugger/mnist_tensor_analysis/src/mnist.py new file mode 100644 index 0000000000..6538c6eacb --- /dev/null +++ b/sagemaker-debugger/mnist_tensor_analysis/src/mnist.py @@ -0,0 +1,90 @@ +# Standard Library +import argparse + +# Third Party +import mxnet as mx +import numpy as np +from mxnet import autograd, gluon, init +from mxnet.gluon import nn +from mxnet.gluon.data.vision import datasets, transforms +import os +import time + + +def parse_args(): + parser = argparse.ArgumentParser(description="Train a mxnet gluon model") + parser.add_argument( + "--output-s3-uri", + type=str, + default=None, + help="S3 URI of the bucket where tensor data will be stored.", + ) + parser.add_argument( + "--smdebug_path", + type=str, + default=None, + help="S3 URI of the bucket where tensor data will be stored.", + ) + parser.add_argument("--initializer", type=int, default=1, help="Variable to change intializer") + parser.add_argument("--lr", type=float, default=0.001, help="Variable to change learning rate") + opt = parser.parse_args() + return opt + + +def create_gluon_model(initializer): + net = nn.HybridSequential() + net.add( + nn.Conv2D(channels=6, kernel_size=5, activation="relu"), + nn.MaxPool2D(pool_size=2, strides=2), + nn.Conv2D(channels=16, kernel_size=3, activation="relu"), + nn.MaxPool2D(pool_size=2, strides=2), + nn.Flatten(), + nn.Dense(120, activation="relu"), + nn.Dense(84, activation="relu"), + nn.Dense(10), + ) + if initializer == 1: + net.initialize(init=init.Xavier(), ctx=mx.cpu()) + elif initializer == 2: + # variance will not remain the same across layers + net.initialize(init=init.Uniform(1), ctx=mx.cpu()) + else: + # does not break symmetry,so gradients will not differ much + net.initialize(init=init.Uniform(0.0001), ctx=mx.cpu()) + return net + + +def train_model(batch_size, net, train_data, lr): + softmax_cross_entropy = gluon.loss.SoftmaxCrossEntropyLoss() + trainer = gluon.Trainer(net.collect_params(), "sgd", {"learning_rate": lr}) + for epoch in range(3): + for data, label in train_data: + data = data.as_in_context(mx.cpu(0)) + with autograd.record(): + output = net(data) + loss = softmax_cross_entropy(output, label) + loss.backward() + trainer.step(batch_size) + print(np.mean(loss.asnumpy())) + +def prepare_data(batch_size): + mnist_train = datasets.FashionMNIST(train=True) + transformer = transforms.Compose([transforms.ToTensor(), transforms.Normalize(0.286, 0.352)]) + mnist_train = mnist_train.transform_first(transformer) + train_data = gluon.data.DataLoader( + mnist_train, batch_size=batch_size, shuffle=True, num_workers=4 + ) + + return train_data + + +def main(): + opt = parse_args() + net = create_gluon_model(opt.initializer) + train_data = prepare_data(128) + train_model(128, net, train_data, opt.lr) + + +if __name__ == "__main__": + main() + diff --git a/sagemaker-debugger/mnist_tensor_analysis/utils.py 
b/sagemaker-debugger/mnist_tensor_analysis/utils.py new file mode 100644 index 0000000000..aaa2654398 --- /dev/null +++ b/sagemaker-debugger/mnist_tensor_analysis/utils.py @@ -0,0 +1,137 @@ +import numpy as np +import math +import matplotlib.pyplot as plt +from matplotlib.animation import FuncAnimation +plt.rcParams.update({'font.size': 8}) + +# create slider and update menus for each training step +def create_slider(steps): + updatemenus = [dict(type='buttons', + direction= 'left', + pad=dict(r= 10, t=85), + showactive = True, + x= 0.1, + y= 0, + xanchor= 'right', + yanchor= 'top', + buttons=[dict(label='Play', + method='animate', + args=[[f'{k}' for k in range(steps)], + dict(frame=dict(duration=100, redraw=True), + transition=dict(duration=300), + easing='linear', + fromcurrent=True, + mode='immediate')])]) + ] + + sliders = [{'yanchor': 'top', + 'xanchor': 'left', + 'currentvalue': {'font': {'size': 16}, + 'prefix': 'Step: ', + 'visible': True, + 'xanchor': 'right'}, + 'transition': {'duration': 500.0, + 'easing': 'linear'}, + 'pad': {'b': 10, 't': 50}, + 'len': 0.9, 'x': 0.1, 'y': 0, + 'steps': [{'args': [[k], {'frame': {'duration': 500.0, + 'easing': 'linear', + 'redraw': True}, + 'transition': {'duration': 0, + 'easing': 'linear'}}], + 'label': k, + 'method': 'animate'} + for k in range(steps) + ]}] + + return updatemenus, sliders + +# create animated histograms with plotly +def create_interactive_plotly_histogram(tensors): + import plotly.graph_objects as go + from plotly.subplots import make_subplots + tname = list(tensors.keys())[0] + steps = list(tensors[tname].keys()) + nrows = math.ceil(len(tensors.keys())/2) + + # create a plot for each layer in the neural network + fig = make_subplots(rows=nrows, + cols=2, + horizontal_spacing = 0.05, + vertical_spacing = 0.1, + subplot_titles = (list(tensors.keys()))) + + # plot histograms for training step 0 + row,col = 1,1 + for tname in tensors: + x = tensors[tname][steps[0]].flatten() + + fig.add_trace(go.Histogram(x = x, nbinsx = 100), row, col) + if col >= 2: + row += 1 + col = 0 + col += 1 + + # set frames for each training step + frames = [] + for idx,step in enumerate(steps): + frame = {'data': [], + 'name': str(idx), + 'traces': np.arange(len(tensors.keys()))} + for tname in tensors: + x = tensors[tname][step].flatten() + frame['data'].append(go.Histogram(x = x, nbinsx=100)) + frames.append(frame) + + # create slider and update menus + updatemenus, sliders = create_slider(len(steps)) + + # set frames and update layout + fig.update(frames=frames) + fig.update_layout(width=1000, height=nrows*400, + showlegend=False, + plot_bgcolor='rgba(0,0,0,0)', + updatemenus=updatemenus, + sliders=sliders) + + return fig.show(renderer="iframe") + + + +# create animated histograms with matplotlib +def create_interactive_matplotlib_histogram(tensors, filename="data/animation.gif"): + nrows = math.ceil(len(tensors.keys())/2) + if nrows == 1: + nrows = 2 + fig, axes = plt.subplots(nrows, 2, figsize=(15, nrows*5)) + plt.subplots_adjust(wspace = 0.5, hspace = 0.3) + tname = list(tensors.keys())[0] + steps = list(tensors[tname].keys()) + + # function that defines the data for the different frames + def animate(frame): + row,col = 0,0 + for tname in tensors: + + # get new data for the histogram + z = tensors[tname][steps[frame]] + neg_values = np.where(z <= 0)[0] + if col > 1: + row += 1 + col = 0 + + # clear previous histogram data + axes[row,col].clear() + + # set title and new histogram data + axes[row,col].set_title( "{} \n Step {} : 
{:.0f}% of values below 0 - variance : {:.2f}".format( + tname, steps[frame], (len(neg_values)/z.size)*100, np.var(z))) + axes[row,col].hist(z.flatten(),bins=100) + col += 1 + + simulation = FuncAnimation(fig, animate, frames=len(steps), interval=1, repeat=False) + simulation.save(filename, writer='pillow', fps=5) + fig.tight_layout() + plt.close() + \ No newline at end of file diff --git a/sagemaker-debugger/mxnet_realtime_analysis/mxnet-realtime-analysis.ipynb b/sagemaker-debugger/mxnet_realtime_analysis/mxnet-realtime-analysis.ipynb new file mode 100644 index 0000000000..a237206d11 --- /dev/null +++ b/sagemaker-debugger/mxnet_realtime_analysis/mxnet-realtime-analysis.ipynb @@ -0,0 +1,428 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Debugging Amazon SageMaker training jobs in real time with Debugger" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Overview" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Debugger is a new capability of Amazon SageMaker that allows debugging machine learning training. \n", + "It lets you go beyond just looking at scalars like losses and accuracies during training and gives you full visibility into all tensors 'flowing through the graph' during training. Debugger helps you monitor your training in near real time using rules and provides alerts once it detects inconsistencies in the training flow.\n", + "\n", + "Using Debugger is a two-step process: saving tensors and analysis. Let's look at each one of them closely.\n", + "\n", + "### Saving tensors\n", + "\n", + "Tensors define the state of the training job at any particular instant in its lifecycle. Debugger exposes a library which allows you to capture these tensors and save them for analysis.\n", + "\n", + "### Analysis\n", + "\n", + "There are two ways to get to tensors and run analysis on them. One way is to use a concept called ***Rules***. For more information about a rules-based approach to analysis, see [Rules](https://github.com/awslabs/sagemaker-debugger/blob/master/docs/analysis.md#Rules). The focus of this notebook is on another way of analysis: **Manual**.\n", + "\n", + "Manual analysis is what you use when there are no rules available to detect the type of issue you are running into, and you need to get to the raw tensors in order to understand what data is travelling through your model during training and, hopefully, root-cause a problem or two with your training job.\n", + "\n", + "Manual analysis is powered by the Debugger API - a framework that allows you to retrieve tensors and scalars (i.e. debugging data) saved during a training job via a few lines of code. One of the most powerful features it provides is real-time access to data - you can get tensors and scalars ***while your training job is running***.\n", + "\n", + "This example guides you through installing the required components for emitting tensors in an Amazon SageMaker training job and using the Debugger API to access those tensors while training is running. Use a small gluon CNN model and train it on the FashionMNIST dataset. While the job is running, you retrieve activations of the first convolutional layer from each of 100 batches and visualize them. We will also visualize the weights of that layer after the job is done.\n", + "\n", + "Before we proceed, let us install the smdebug library, which allows us to perform interactive analysis in this notebook. 
After installing it, please restart the kernel, and when you come back skip this cell.\n", + "\n", + "### Installing smdebug\n" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "!python -m pip install smdebug" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Training MXNet models in Amazon SageMaker with Debugger\n", + "\n", + "Train a small MXNet CNN model with the FashionMNIST dataset in this notebook, with Debugger enabled. This is done using an Amazon SageMaker MXNet 1.6.0 container with script mode. Debugger currently works with Python3, so be sure to set `py_version='py3'` when creating the Amazon SageMaker Estimator.\n", + "\n", + "First, run a simple training script, mnist_gluon_realtime_visualize_demo.py, with Debugger enabled in Amazon SageMaker using the Amazon SageMaker Estimator API. In this example, for simplicity's sake, Debugger captures all tensors as specified in its configuration every 100 steps (one step is one batch). While the training job is running, use the Debugger API to access the saved tensors in real time and visualize them. Rely on Debugger to take care of downloading a fresh set of tensors every time you query for them.\n", + "\n", + "## Enable Debugger in Estimator object\n", + "\n", + "Enabling Debugger in a training job can be accomplished by adding its configuration into the Estimator object constructor:\n", + "\n", + "```\n", + "sagemaker_simple_estimator = MXNet(...,\n", + " debugger_hook_config = DebuggerHookConfig(\n", + " s3_output_path=\"s3://{bucket_name}/{location_in_bucket}\", # Required\n", + " collection_configs=[\n", + " CollectionConfig(\n", + " name=\"conv0_tensors\",\n", + " parameters={\n", + " \"include_regex\": \"conv0.*\",\n", + " \"save_interval\": \"100\"\n", + " }\n", + " )\n", + " ]\n", + " )\n", + ")\n", + "```\n", + "Consider this almost \"magical\" config object. Its purpose is to instruct the Estimator (and the CreateTrainingJob API method that is called) about what debugging data you are interested in for the debugging and visualization exercise. Here are the two parameters: \n", + "- `s3_output_path`: points to the S3 bucket where you intend to store the debugging tensors. The amount of data saved depends on multiple factors; major ones are the training job, dataset, model, and frequency of saving tensors. This bucket should be in your AWS account, and you have full access control over it. **Important**: This S3 bucket should be originally created in the same Region where your training job will be running; otherwise you might run into problems with cross-Region access.\n", + "- `collection_configs` enumerates named collections of tensors to save. Collections are a convenient way to organize relevant tensors under the same umbrella to make it easy to navigate them during analysis. In this particular case, we create a single collection named 'conv0_tensors' and ask it to save all tensors whose name matches the 'conv0.\\*' regex. You know this name based on the structure of the model defined in the [model training script](./scripts/mnist_gluon_realtime_visualize_demo.py). You also instruct Debugger to save tensors every 100 steps, where one step is one batch during a training job. 
Also, see the [Collection](https://github.com/awslabs/sagemaker-debugger/blob/master/docs/API.md#collection) documentation for all parameters that are supported by Collections and DebuggerHookConfig." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Configuring the inputs for the training job\n", + "\n", + "Now call the Amazon SageMaker MXNet Estimator to kick off a training job along with enabling Debugger functionality.\n", + "\n", + "- `entry_point_script` points to the simple MXNet training script that is run by the training job\n", + "- `hyperparameters` are the parameters that will be passed to the training script\n", + "- `train_volume_size` with value *400* ensures enough EBS volume is provisioned to collect tensors emitted by the training job" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "%load_ext autoreload\n", + "%autoreload 2\n", + "\n", + "import sagemaker\n", + "import boto3\n", + "import os\n", + "\n", + "from sagemaker.mxnet import MXNet\n", + "from sagemaker.debugger import rule_configs, DebuggerHookConfig, CollectionConfig\n", + "from smdebug.mxnet import modes\n", + "\n", + "sagemaker_session = sagemaker.Session()\n", + "\n", + "entry_point_script = './scripts/mnist_gluon_realtime_visualize_demo.py'\n", + "hyperparameters = {'batch-size': 256, 'learning_rate': 0.1, 'epochs': 10}\n", + "base_job_name = 'mxnet-realtime-analysis-example'\n", + "\n", + "# Make sure to set this to your bucket and location\n", + "BUCKET_NAME = sagemaker_session.default_bucket()\n", + "LOCATION_IN_BUCKET = 'smdebug-real-time-demo'\n", + "\n", + "s3_bucket_for_tensors = 's3://{BUCKET_NAME}/{LOCATION_IN_BUCKET}'.format(BUCKET_NAME=BUCKET_NAME, LOCATION_IN_BUCKET=LOCATION_IN_BUCKET)" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "sagemaker_simple_estimator = MXNet(\n", + " role=sagemaker.get_execution_role(),\n", + " base_job_name=base_job_name,\n", + " train_instance_count=1,\n", + " train_instance_type='ml.m4.xlarge',\n", + " train_volume_size=400,\n", + " entry_point=entry_point_script,\n", + " hyperparameters=hyperparameters,\n", + " framework_version='1.6.0',\n", + " py_version='py3',\n", + " train_max_run=3600,\n", + " sagemaker_session=sagemaker_session,\n", + " debugger_hook_config = DebuggerHookConfig(\n", + " s3_output_path=s3_bucket_for_tensors, # Required\n", + " collection_configs=[\n", + " CollectionConfig(\n", + " name=\"conv0_tensors\",\n", + " parameters={\n", + " \"include_regex\": \"conv0.*\",\n", + " \"save_interval\": \"100\"\n", + " }\n", + " )\n", + " ]\n", + " )\n", + ")" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "With the next step, start a training job using the Estimator object you created above. This job is started in an asynchronous, non-blocking way. This means that control is passed back to the notebook and further commands can be run while the training job progresses." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "# This is a fire and forget event. 
By setting wait=False, we just submit the job to run in the background.\n", + "# SageMaker will spin off one training job and release control to the next cells in the notebook.\n", + "# Please follow this notebook to see the status of the training job.\n", + "sagemaker_simple_estimator.fit(wait=False)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "\n", + "### Result\n", + "\n", + "As a result of the above command, Amazon SageMaker starts one training job for you, and it produces the tensors to be analyzed. This job runs in the background without you having to wait for it to complete in order to continue with the rest of the notebook. Because of this asynchronous nature of the training job, monitor its status so that you don't start to request debugging tensors too early. Tensors are only produced during the training phase of the Amazon SageMaker training job, so wait until that begins.\n", + "\n", + "## Analysis and Visualization\n", + "\n", + "### Checking on the training job status\n", + "\n", + "Check the status of the training job by running the following code. It checks on the status of an Amazon SageMaker training job every 15 seconds. After a job has started its training cycle, control is released to the next cells in the notebook. That means the training job has started to tune the model and, in parallel, to emit debugging tensors." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "# a helper method first, to render status updates\n", + "import time\n", + "import sys\n", + "from time import gmtime, strftime\n", + "\n", + "def print_same_line(s):\n", + " sys.stdout.write('\\r{}: {}'.format(strftime('%X', gmtime()), s))\n", + " sys.stdout.flush()\n", + " \n", + "# The command below will give the status of the training job\n", + "# Note: In the output of the command you will see the DebugConfig parameter, \n", + "# which describes what, where and how debugging data is to be collected\n", + "job_name = sagemaker_simple_estimator.latest_training_job.name\n", + "print('Training job name: ' + job_name)\n", + "\n", + "client = sagemaker_simple_estimator.sagemaker_session.sagemaker_client\n", + "\n", + "description = client.describe_training_job(TrainingJobName=job_name)\n", + "\n", + "if description['TrainingJobStatus'] != 'Completed':\n", + " while description['SecondaryStatus'] not in {'Training', 'Completed'}:\n", + " description = client.describe_training_job(TrainingJobName=job_name)\n", + " primary_status = description['TrainingJobStatus']\n", + " secondary_status = description['SecondaryStatus']\n", + " print_same_line('Current job status: [PrimaryStatus: {}, SecondaryStatus: {}]'.format(primary_status, secondary_status))\n", + " time.sleep(15)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Retrieving and Analyzing tensors\n", + "\n", + "Before getting to analysis, here are some notes on concepts used in Debugger that help with analysis.\n", + "- ***Trial*** - the object that is the centerpiece of the Debugger API when it comes to getting access to tensors. It is a top-level abstraction that represents a single run of a training job. All tensors emitted by a training job are associated with its *trial*.\n", + "- ***Step*** - the object that represents the next level of abstraction. In Debugger, a *step* is a representation of a single batch of a training job. Each trial has multiple steps. 
Each tensor is associated with multiple steps - having a particular value at each of the steps.\n", + "- ***Tensor*** - the object that represents an actual *tensor* saved during the training job. *Note*: it could be a scalar as well (for example, losses are saved as scalars).\n", + "\n", + "For more details on the aforementioned concepts, as well as on the Debugger API in general (including examples), please refer to the [Debugger Analysis API](https://github.com/awslabs/sagemaker-debugger/blob/master/docs/analysis.md) documentation.\n", + "\n", + "Below, you can find several methods that help with retrieving and plotting tensors. In *get_data* we use the concepts described above to retrieve data: it expects a steps_range with one or more steps (batches) for which you want to get tensors. The two other methods are helpers to plot tensors." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "import numpy as np\n", + "import matplotlib.pyplot as plt\n", + "\n", + "def get_data(trial, tname, batch_index, steps_range, mode=modes.GLOBAL):\n", + " tensor = trial.tensor(tname)\n", + " vals = []\n", + " for s in steps_range:\n", + " val = tensor.value(step_num=s, mode=mode)[batch_index][0]\n", + " vals.append(val)\n", + " return vals\n", + "\n", + "def create_plots(steps_range):\n", + " fig, axs = plt.subplots(nrows=1, ncols=len(steps_range), constrained_layout=True, figsize=(2*len(steps_range), 2),\n", + " subplot_kw={'xticks': [], 'yticks': []})\n", + " return fig, axs\n", + "\n", + "def plot_tensors(trial, layer, batch_index, steps_range):\n", + " if len(steps_range) > 0: \n", + " fig, axs = create_plots(steps_range)\n", + " vals = get_data(trial, layer, batch_index, steps_range)\n", + "\n", + " for ax, image, step in zip(axs.flat if isinstance(axs, np.ndarray) else np.array([axs]), vals, steps_range):\n", + " ax.imshow(image, cmap='gray')\n", + " ax.set_title(str(step))\n", + " plt.show()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Now that you are prepared with methods to get data and plot it, get to it. The goal of the next block is to instantiate a ***Trial***, a central access point for all Debugger API calls to get tensors. Do that by inspecting the currently running training job and extracting the necessary parameters from its debug config to instruct Debugger where the data you are looking for is located. Note:\n", + "- Tensors are stored in your own S3 bucket, which you can navigate to and manually inspect if desired.\n", + "- You might notice a slight delay before the trial object is created. This is normal, as Debugger monitors the corresponding bucket and waits until tensors appear in it. The delay is introduced by the less-than-instantaneous upload of tensors from the training container to your S3 bucket. " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "import os\n", + "from urllib.parse import urlparse\n", + "from smdebug.trials import create_trial\n", + "\n", + "# this is where we create a Trial object that allows access to saved tensors\n", + "trial = create_trial(sagemaker_simple_estimator.get_debugger_artifacts_path())" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Using the next command you can conveniently inspect all tensors that are produced by the model and saved by Debugger. You can do that easily because you put them under the umbrella of a single collection." 
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "# inspect tensors saved in conv0_tensors collection (for conv0 layer of our model)\n", + "trial.tensor_names(collection=\"conv0_tensors\")" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Visualize tensors of a running training job\n", + "Below you wait until Debugger has downloaded an initial chunk of tensors to look at. Once that first chunk is ready, you get new chunks every 5 seconds and can plot their tensors one under another." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "# Below we select the very first tensor from every batch.\n", + "# Feel free to modify this and select another tensor from the batch.\n", + "batch_index = 0\n", + "\n", + "# This is the name of the tensor to retrieve data for.\n", + "# The variable is called `layer` as this tensor happens to be the output of the first convolutional layer.\n", + "layer = 'conv0_output_0'\n", + "\n", + "steps = 0\n", + "while steps == 0:\n", + " # trial.steps returns all steps that have been downloaded by Debugger to date.\n", + " # It doesn't represent all steps that are to be available once the training job is complete -\n", + " # it is a snapshot of the current state of the training job. If you call it after the training job is done,\n", + " # you will get all tensors available at once.\n", + " steps = trial.steps()\n", + " print_same_line('Waiting for tensors to become available...')\n", + " time.sleep(3)\n", + "print('\\nDone')\n", + "\n", + "print('Getting tensors and plotting...')\n", + "rendered_steps = []\n", + "\n", + "# trial.loaded_all_steps is a way to keep monitoring the state of the training job as seen by Debugger.\n", + "# When SageMaker completes the training job, Debugger (and the trial) becomes aware of it.\n", + "\n", + "loaded_all_steps = False\n", + "while not loaded_all_steps:\n", + " loaded_all_steps = trial.loaded_all_steps\n", + " steps = trial.steps()\n", + " # quick way to get the diff between two lists\n", + " steps_to_render = list(set(steps).symmetric_difference(set(rendered_steps)))\n", + " # plot only tensors from the newer chunk\n", + " plot_tensors(trial, layer, batch_index, steps_to_render)\n", + " rendered_steps.extend(steps_to_render)\n", + " time.sleep(5)\n", + "print('\\nDone')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Additional visualizations\n", + "\n", + "Now that you have completed plotting tensors showing the output of the first layer of the model during the training job run, plot more tensors! This time you get all of them at once, as the training job has finished and Debugger is aware of all tensors emitted by it. You can visualize tensors representing the weights of the first convolutional layer (i.e., its kernels). By inspecting each row of plotted tensors from left to right, you can notice the progression of how each kernel \"learned\" its values. You will most likely notice that most changes in the kernels happen during the first steps of training. Toward the end of the training job, updates to the kernels become less and less noticeable. This suggests the training job is converging. *Note*: convergence doesn't necessarily mean an increase in accuracy, but it often accompanies one." 
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "# Let's visualize weights of the first convolutional layer as they progressively change through training.\n", + "layer = 'conv0_weight'\n", + "\n", + "steps = trial.tensor(layer).steps()\n", + "for i in range(0, trial.tensor(layer).value(step_num=steps[0]).shape[0]):\n", + " plot_tensors(trial, layer, i, trial.tensor(layer).steps())" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [] + } + ], + "metadata": { + "kernelspec": { + "display_name": "Python 3", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.7.4" + } + }, + "nbformat": 4, + "nbformat_minor": 4 +} diff --git a/sagemaker-debugger/mxnet_realtime_analysis/scripts/mnist_gluon_realtime_visualize_demo.py b/sagemaker-debugger/mxnet_realtime_analysis/scripts/mnist_gluon_realtime_visualize_demo.py new file mode 100644 index 0000000000..f8896bfeb0 --- /dev/null +++ b/sagemaker-debugger/mxnet_realtime_analysis/scripts/mnist_gluon_realtime_visualize_demo.py @@ -0,0 +1,122 @@ +# Standard Library +import argparse +import time + +# Third Party +import mxnet as mx +from mxnet import autograd, gluon, init +from mxnet.gluon import nn +from mxnet.gluon.data.vision import datasets, transforms + +def parse_args(): + parser = argparse.ArgumentParser( + description="Train an mxnet gluon model for the FashionMNIST dataset" + ) + parser.add_argument("--batch-size", type=int, default=256, help="Batch size") + parser.add_argument( + "--epochs", type=int, default=50, help="Amount of epochs to run training loop" + ) + parser.add_argument("--learning_rate", type=float, default=0.1) + opt = parser.parse_args() + return opt + + +def acc(output, label): + return (output.argmax(axis=1) == label.astype("float32")).mean().asscalar() + + +def train_model(batch_size, net, train_data, valid_data, lr, epochs): + softmax_cross_entropy = gluon.loss.SoftmaxCrossEntropyLoss() + trainer = gluon.Trainer(net.collect_params(), "sgd", {"learning_rate": lr}) + # Start the training. 
+ for epoch in range(epochs): + train_loss, train_acc, valid_acc = 0.0, 0.0, 0.0 + tic = time.time() + for data, label in train_data: + data = data.as_in_context(mx.cpu(0)) + # forward + backward + with autograd.record(): + output = net(data) + loss = softmax_cross_entropy(output, label) + loss.backward() + # update parameters + trainer.step(batch_size) + # calculate training metrics + train_loss += loss.mean().asscalar() + train_acc += acc(output, label) + # calculate validation accuracy + for data, label in valid_data: + data = data.as_in_context(mx.cpu(0)) + valid_acc += acc(net(data), label) + print( + "Epoch %d: loss %.3f, train acc %.3f, test acc %.3f, in %.1f sec" + % ( + epoch, + train_loss / len(train_data), + train_acc / len(train_data), + valid_acc / len(valid_data), + time.time() - tic, + ) + ) + + +def prepare_data(batch_size): + mnist_train = datasets.FashionMNIST(train=True) + X, y = mnist_train[0] + print("X shape: ", X.shape, " X dtype: ", X.dtype, " y: ", y) + # class names for the FashionMNIST labels + text_labels = [ + "t-shirt", + "trouser", + "pullover", + "dress", + "coat", + "sandal", + "shirt", + "sneaker", + "bag", + "ankle boot", + ] + transformer = transforms.Compose([transforms.ToTensor(), transforms.Normalize(0.13, 0.31)]) + mnist_train = mnist_train.transform_first(transformer) + train_data = gluon.data.DataLoader( + mnist_train, batch_size=batch_size, shuffle=True, num_workers=4 + ) + mnist_valid = gluon.data.vision.FashionMNIST(train=False) + valid_data = gluon.data.DataLoader( + mnist_valid.transform_first(transformer), batch_size=batch_size, num_workers=4 + ) + return train_data, valid_data + + +# Create a model using gluon API. +def create_gluon_model(): + # Create Model in Gluon + net = nn.HybridSequential(prefix="sequential_") + net.add( + nn.Conv2D(channels=6, kernel_size=5, activation="relu"), + nn.MaxPool2D(pool_size=2, strides=2), + nn.Conv2D(channels=16, kernel_size=3, activation="relu"), + nn.MaxPool2D(pool_size=2, strides=2), + nn.Flatten(), + nn.Dense(120, activation="relu"), + nn.Dense(84, activation="relu"), + nn.Dense(10), + ) + net.initialize(init=init.Xavier(), ctx=mx.cpu()) + return net + +def main(): + opt = parse_args() + + # Create a Gluon Model. + net = create_gluon_model() + + # Start the training. + batch_size = opt.batch_size + train_data, valid_data = prepare_data(batch_size) + + train_model(batch_size, net, train_data, valid_data, opt.learning_rate, opt.epochs) + +if __name__ == "__main__": + main() diff --git a/sagemaker-debugger/mxnet_spot_training/mxnet-spot-training-with-sagemakerdebugger.ipynb b/sagemaker-debugger/mxnet_spot_training/mxnet-spot-training-with-sagemakerdebugger.ipynb new file mode 100644 index 0000000000..39dd39e1b2 --- /dev/null +++ b/sagemaker-debugger/mxnet_spot_training/mxnet-spot-training-with-sagemakerdebugger.ipynb @@ -0,0 +1,186 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Enable Spot Training with Amazon SageMaker Debugger\n", + "\n", + "Amazon SageMaker Debugger is a new capability of Amazon SageMaker that allows debugging machine learning training. \n", + "It lets you go beyond just looking at scalars like losses and accuracies during training and gives you full visibility into all tensors 'flowing through the graph' during training. 
Amazon SageMaker Debugger helps you monitor your training in near real time using rules and provides alerts once it detects inconsistencies in the training flow.\n", + "\n", + "Using Amazon SageMaker Debugger is a two-step process: saving tensors and analysis.\n", + "\n", + "### Saving tensors\n", + "Tensors define the state of the training job at any particular instant in its lifecycle. Debugger exposes a library which allows you to capture these tensors and save them for analysis.\n", + "\n", + "### Analysis\n", + "There are two ways to get to tensors and run analysis on them. One way is to use a concept called ***Rules***. For more information about a rules-based approach to analysis, see [Rules](https://github.com/awslabs/sagemaker-debugger/blob/master/docs/analysis.md#Rules). You can also perform interactive analysis in a notebook. Please refer to our other notebooks on how to do that.\n", + "\n", + "## Spot Training\n", + "This notebook shows how the Amazon SageMaker Debugger feature can also be used with Spot Training. For more information related to spot training in Amazon SageMaker, please see [Spot Training](https://docs.aws.amazon.com/sagemaker/latest/dg/model-managed-spot-training.html).\n", + "\n", + "This example uses a small gluon CNN model and trains it on the FashionMNIST dataset. If the spot instance terminates during training, the training and the analysis of tensors will continue from the last saved checkpoint." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "import sagemaker\n", + "import boto3\n", + "import os\n", + "from sagemaker.mxnet import MXNet\n", + "from sagemaker.debugger import Rule, rule_configs\n" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Configuring the inputs for the training job\n", + "\n", + "Now call the Amazon SageMaker MXNet Estimator to kick off a training job along with enabling Debugger functionality.\n", + "\n", + "- `entrypoint_script` points to the simple MXNet training script that is run by the training job\n", + "- `hyperparameters` are the parameters that will be passed to the training script." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "# Set the SageMaker Session\n", + "sagemaker_session = sagemaker.Session()\n", + "\n", + "# Define the entrypoint script\n", + "entrypoint_script='mxnet_gluon_spot_training.py'\n", + "hyperparameters = {'batch-size' : 100, 'epochs' : 5, 'checkpoint-path' : '/opt/ml/checkpoints' }\n" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Training MXNet models in Amazon SageMaker with Amazon SageMaker Debugger\n", + "\n", + "Train a small MXNet CNN model with the FashionMNIST dataset in this notebook, with Amazon SageMaker Debugger enabled. This is done using an Amazon SageMaker MXNet 1.6.0 container with script mode. 
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "## Training MXNet models in Amazon SageMaker with Amazon SageMaker Debugger\n",
+    "\n",
+    "Train a small MXNet CNN model with the FashionMNIST dataset in this notebook, with Amazon SageMaker Debugger enabled. This is done using an Amazon SageMaker MXNet 1.6.0 container with script mode. Amazon SageMaker Debugger currently works with Python 3, so be sure to set `py_version='py3'` when creating the Amazon SageMaker Estimator.\n",
+    "\n",
+    "\n",
+    "## Enable Amazon SageMaker Debugger and Spot Training in the Estimator object\n",
+    "\n",
+    "Enabling Amazon SageMaker Debugger in a training job can be accomplished by adding its configuration to the Estimator object constructor:\n",
+    "\n",
+    "```python\n",
+    "sagemaker_simple_estimator = MXNet(...,\n",
+    "    # Parameters required to enable spot training.\n",
+    "    train_use_spot_instances=True,  # Set to True to enable spot training.\n",
+    "    train_max_wait=10000,  # Should be equal to or greater than train_max_run, in seconds.\n",
+    "    checkpoint_local_path='/opt/ml/checkpoints/',  # Local path where checkpoints are stored during training. The default is /opt/ml/checkpoints. The training script should generate the checkpoints.\n",
+    "    checkpoint_s3_uri='s3://bucket/prefix',  # URI of the S3 bucket where the checkpoints captured by the model will be stored.\n",
+    "    ## Rule Parameter\n",
+    "    rules=[Rule.sagemaker(rule_configs.vanishing_gradient())]\n",
+    ")\n",
+    "```\n",
+    "In this section, we focus on the parameters that are needed to enable Spot Training.\n",
+    "\n",
+    "- `train_use_spot_instances` : Set this parameter to `True` to enable spot training.\n",
+    "- `train_max_wait` : This parameter (in seconds) should be set equal to or greater than `train_max_run`. \n",
+    "- `checkpoint_s3_uri` : This is the URI of the S3 bucket where the checkpoints will be stored before the spot instance is terminated. Once training resumes, the checkpoints from this S3 bucket are restored to `checkpoint_local_path` on the new instance. Ensure that the S3 bucket is created in the same region as the current session.\n",
+    "- `checkpoint_local_path` : This is the local path where the model saves its checkpoints periodically. The default path is `/opt/ml/checkpoints`. Ensure that the model under training saves the checkpoints to this path. Note that in `hyperparameters` we set `checkpoint-path` so that the training script will save the checkpoints in that directory.\n",
+    "\n",
+    "\n",
+    "### Rule Parameter\n",
+    "We are going to run the *vanishing_gradient* rule during this training. By specifying this parameter, we enable the Amazon SageMaker Debugger functionality to collect the *gradients* during this training. The *gradients* will be collected every 500 steps as part of the default configuration for this rule.\n",
+    "\n",
+    "\n",
+    "## How Spot Training works with Amazon SageMaker Debugger\n",
+    "\n",
+    "Amazon SageMaker Debugger can be enabled even for training with spot instances. Spot instances can be interrupted, causing jobs to take longer to start or finish. To leverage the managed spot instance support that Amazon SageMaker provides, you need to configure your training job to save checkpoints. Amazon SageMaker copies checkpoint data from a local path to Amazon S3. When the job is restarted on a different instance, Amazon SageMaker copies the data from Amazon S3 back into the local path. The training can then resume from the last checkpoint instead of restarting.\n",
+    "\n",
+    "Amazon SageMaker Debugger relies on this checkpoint mechanism to continue emitting tensors from the last saved checkpoint. Amazon SageMaker Debugger saves metadata containing the last saved state whenever the user creates a checkpoint in *checkpoint_local_path*. Along with the checkpoints, this metadata also gets saved to Amazon S3 when the instance is interrupted. Upon restart, the metadata is copied back to the instance along with the checkpoints. Amazon SageMaker Debugger reads the last saved state from the metadata and continues to emit tensors from that step, which minimizes the emission of duplicate tensors. Note that currently, the rule job continues to wait even while the training job is interrupted.\n"
+   ]
+  },
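+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "The training script in this example saves its parameters as `params_<epoch>.params` files under the checkpoint directory. A minimal sketch of resume logic (an illustrative assumption, not part of the example script as-is) that the script could use to pick up from the newest restored checkpoint:\n",
+    "\n",
+    "```python\n",
+    "import os\n",
+    "\n",
+    "def load_latest_checkpoint(net, checkpoint_path):\n",
+    "    \"\"\"Load the newest params_<epoch>.params file; return the next epoch to run.\"\"\"\n",
+    "    if not os.path.isdir(checkpoint_path):\n",
+    "        return 0  # nothing restored yet, start from scratch\n",
+    "    params = [f for f in os.listdir(checkpoint_path) if f.endswith('.params')]\n",
+    "    if not params:\n",
+    "        return 0\n",
+    "    latest_epoch = max(int(f.split('_')[1].split('.')[0]) for f in params)\n",
+    "    net.load_parameters(os.path.join(checkpoint_path, 'params_%d.params' % latest_epoch))\n",
+    "    return latest_epoch + 1\n",
+    "```"
+   ]
+  },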
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "# Make sure to set this to your bucket and location\n",
+    "# Ensure that the bucket exists in the same region as the current session.\n",
+    "BUCKET_NAME = sagemaker_session.default_bucket()\n",
+    "LOCATION_IN_BUCKET = 'smdebug-checkpoints'\n",
+    "\n",
+    "checkpoint_s3_bucket = 's3://{BUCKET_NAME}/{LOCATION_IN_BUCKET}'.format(BUCKET_NAME=BUCKET_NAME, LOCATION_IN_BUCKET=LOCATION_IN_BUCKET)\n",
+    "\n",
+    "# Local path where the model will save its checkpoints.\n",
+    "checkpoint_local_path = '/opt/ml/checkpoints'\n"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "estimator = MXNet(\n",
+    "    role=sagemaker.get_execution_role(),\n",
+    "    base_job_name='smdebugger-spot-training-demo-mxnet',\n",
+    "    train_instance_count=1,\n",
+    "    train_instance_type='ml.m4.xlarge',\n",
+    "    train_volume_size=400,\n",
+    "    entry_point=entrypoint_script,\n",
+    "    hyperparameters=hyperparameters,\n",
+    "    framework_version='1.6.0',\n",
+    "    py_version='py3',\n",
+    "    train_max_run=3600,\n",
+    "    sagemaker_session=sagemaker_session,\n",
+    "\n",
+    "    # Parameters required to enable spot training.\n",
+    "    train_use_spot_instances=True,  # Set to True to enable spot training.\n",
+    "    train_max_wait=3600,  # Should be equal to or greater than train_max_run, in seconds\n",
+    "    checkpoint_s3_uri=checkpoint_s3_bucket,  # Set the S3 URI to store the checkpoints.\n",
+    "    checkpoint_local_path=checkpoint_local_path,  # This is the default path where checkpoints will be stored. 
The training script should generate the checkpoints.\n",
+    "\n",
+    "    ## Rule parameter\n",
+    "    rules=[Rule.sagemaker(rule_configs.vanishing_gradient())]\n",
+    ")"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "estimator.fit()"
+   ]
+  }
+ ],
+ "metadata": {
+  "kernelspec": {
+   "display_name": "Python 3",
+   "language": "python",
+   "name": "python3"
+  },
+  "language_info": {
+   "codemirror_mode": {
+    "name": "ipython",
+    "version": 3
+   },
+   "file_extension": ".py",
+   "mimetype": "text/x-python",
+   "name": "python",
+   "nbconvert_exporter": "python",
+   "pygments_lexer": "ipython3",
+   "version": "3.7.4"
+  }
+ },
+ "nbformat": 4,
+ "nbformat_minor": 4
+}
diff --git a/sagemaker-debugger/mxnet_spot_training/mxnet_gluon_spot_training.py b/sagemaker-debugger/mxnet_spot_training/mxnet_gluon_spot_training.py
new file mode 100644
index 0000000000..6a840a313a
--- /dev/null
+++ b/sagemaker-debugger/mxnet_spot_training/mxnet_gluon_spot_training.py
@@ -0,0 +1,154 @@
+# Standard Library
+import argparse
+import random
+
+# Third Party
+import mxnet as mx
+import numpy as np
+from mxnet import autograd, gluon
+from mxnet.gluon import nn
+import os
+
+
+def parse_args():
+    parser = argparse.ArgumentParser(
+        description="Train an MXNet gluon model on the FashionMNIST dataset"
+    )
+    parser.add_argument("--batch-size", type=int, default=256, help="Batch size")
+    parser.add_argument("--epochs", type=int, default=1, help="Number of Epochs")
+    parser.add_argument("--learning_rate", type=float, default=0.1)
+    parser.add_argument(
+        "--context", type=str, default="cpu", help="Context can be either cpu or gpu"
+    )
+    parser.add_argument(
+        "--checkpoint-path",
+        type=str,
+        default="/opt/ml/checkpoints",
+        help="Path where checkpoints will be saved.",
+    )
+
+    opt = parser.parse_args()
+    return opt
+
+
+def test(ctx, net, val_data):
+    metric = mx.metric.Accuracy()
+    for i, (data, label) in enumerate(val_data):
+        data = data.as_in_context(ctx)
+        label = label.as_in_context(ctx)
+        output = net(data)
+        metric.update([label], [output])
+
+    return metric.get()
+
+
+def train_model(net, epochs, ctx, learning_rate, momentum, train_data, val_data, checkpoint_path):
+    # Collect all parameters from net and its children, then initialize them.
+    net.initialize(mx.init.Xavier(magnitude=2.24), ctx=ctx)
+    # Trainer is for updating parameters with gradient.
+    trainer = gluon.Trainer(
+        net.collect_params(), "sgd", {"learning_rate": learning_rate, "momentum": momentum}
+    )
+    metric = mx.metric.Accuracy()
+    loss = gluon.loss.SoftmaxCrossEntropyLoss()
+
+    for epoch in range(epochs):
+        # Reset the data iterator and metric at the beginning of the epoch.
+        metric.reset()
+        for i, (data, label) in enumerate(train_data):
+            # Copy data to ctx if necessary
+            data = data.as_in_context(ctx)
+            label = label.as_in_context(ctx)
+            # Start recording the computation graph with the record() section.
+            # Recorded graphs can then be differentiated with backward.
+            with autograd.record():
+                output = net(data)
+                L = loss(output, label)
+            L.backward()
+            # Take a gradient step with batch_size equal to data.shape[0].
+            trainer.step(data.shape[0])
+            # Update the metric last. 
+            metric.update([label], [output])
+
+            if i % 100 == 0 and i > 0:
+                name, acc = metric.get()
+                print("[Epoch %d Batch %d] Training: %s=%f" % (epoch, i, name, acc))
+
+        name, acc = metric.get()
+        print("[Epoch %d] Training: %s=%f" % (epoch, name, acc))
+        name, val_acc = test(ctx, net, val_data)
+        print("[Epoch %d] Validation: %s=%f" % (epoch, name, val_acc))
+        param_file = "{0}/params_{1}.params".format(checkpoint_path, epoch)
+        print("Saving params to: " + param_file)
+        net.save_parameters(param_file)
+
+
+def transformer(data, label):
+    data = data.reshape((-1,)).astype(np.float32) / 255
+    return data, label
+
+
+def prepare_data(batch_size):
+    train_data = gluon.data.DataLoader(
+        gluon.data.vision.MNIST("./data", train=True, transform=transformer),
+        batch_size=batch_size,
+        shuffle=True,
+        last_batch="discard",
+    )
+
+    val_data = gluon.data.DataLoader(
+        gluon.data.vision.MNIST("./data", train=False, transform=transformer),
+        batch_size=batch_size,
+        shuffle=False,
+    )
+    return train_data, val_data
+
+
+# Create a model using the gluon API. The hook currently
+# supports MXNet gluon models only.
+def create_gluon_model():
+    net = nn.Sequential()
+    with net.name_scope():
+        net.add(nn.Dense(128, activation="relu"))
+        net.add(nn.Dense(64, activation="relu"))
+        net.add(nn.Dense(10))
+    return net
+
+
+def validate():
+    import os, json
+    with open('/opt/ml/input/config/debughookconfig.json') as jsondata:
+        configs = json.load(jsondata)
+    print("DEBUG HOOK CONFIGURATION: ")
+    print(json.dumps(configs, indent=4))
+    print("Validation Complete")
+
+
+def main():
+    opt = parse_args()
+    mx.random.seed(128)
+    random.seed(12)
+    np.random.seed(2)
+
+    context = mx.cpu() if opt.context.lower() == "cpu" else mx.gpu()
+    # Create a Gluon Model.
+    net = create_gluon_model()
+
+    # Start the training.
+    train_data, val_data = prepare_data(opt.batch_size)
+
+    train_model(
+        net=net,
+        epochs=opt.epochs,
+        ctx=context,
+        learning_rate=opt.learning_rate,
+        momentum=0.9,
+        train_data=train_data,
+        val_data=val_data,
+        checkpoint_path=opt.checkpoint_path,
+    )
+    validate()
+
+
+if __name__ == "__main__":
+    main()
diff --git a/sagemaker-debugger/pytorch_custom_container/pytorch_byoc_smdebug.ipynb b/sagemaker-debugger/pytorch_custom_container/pytorch_byoc_smdebug.ipynb
new file mode 100644
index 0000000000..17c9abfa0d
--- /dev/null
+++ b/sagemaker-debugger/pytorch_custom_container/pytorch_byoc_smdebug.ipynb
@@ -0,0 +1,435 @@
+{
+ "cells": [
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "# Using Amazon SageMaker Debugger with your own PyTorch container"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "Amazon SageMaker is a managed platform to build, train and host machine learning models. Amazon SageMaker Debugger is a new feature which offers the capability to debug machine learning and deep learning models during training by identifying and detecting problems with the models in real time."
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "Amazon SageMaker also gives you the option of bringing your own algorithms packaged in a custom container that can then be trained and deployed in the Amazon SageMaker environment. \n",
+    "\n",
+    "This notebook guides you through an example of using your own container with PyTorch for training, along with the recently added feature, Amazon SageMaker Debugger."
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "## How does Amazon SageMaker Debugger work?\n",
+    "\n",
+    "Amazon SageMaker Debugger lets you go beyond just looking at scalars like losses and accuracies during training and gives you full visibility into all tensors 'flowing through the graph' during training. Furthermore, it helps you monitor your training in real time using rules and CloudWatch events, and react to common training issues such as vanishing gradients or poor weight initialization.\n",
+    "\n",
+    "### Concepts\n",
+    "* **Tensor**: These are the artifacts that define the state of the training job at any particular instant in its lifecycle.\n",
+    "* **Debug Hook**: Captures the tensors flowing through the training computational graph every N steps.\n",
+    "* **Debugging Rule**: Logic to analyze the tensors captured by the hook and report anomalies.\n",
+    "\n",
+    "With these concepts in mind, let's understand the overall flow of things which Amazon SageMaker Debugger uses to orchestrate debugging.\n",
+    "\n",
+    "It operates in two steps - saving tensors and analysis.\n",
+    "\n",
+    "### Saving tensors\n",
+    "\n",
+    "Tensors that the debug hook captures are stored in an S3 location specified by you. There are two ways you can configure Amazon SageMaker Debugger for storage:\n",
+    "\n",
+    "  1. With no changes to your training script: If you use any of the SageMaker provided [Deep Learning containers](https://docs.aws.amazon.com/sagemaker/latest/dg/pre-built-containers-frameworks-deep-learning.html) then you don't need to make any changes to your training script for tensors to be stored. Amazon SageMaker Debugger will use the configuration you provide in the framework `Estimator` to save tensors in the fashion you specify.\n",
+    "  2. Orchestrating your script to store tensors: Amazon SageMaker Debugger exposes a library which allows you to capture these tensors and save them for analysis. It's highly customizable and allows you to save the specific tensors you want at different frequencies and configurations. Refer to the [Developer Guide](https://github.com/awslabs/sagemaker-debugger/tree/master/docs) for details on how to use Amazon SageMaker Debugger with your choice of framework in your training script.\n",
+    "\n",
+    "### Analysis of tensors\n",
+    "\n",
+    "Once tensors are saved, Amazon SageMaker Debugger can be configured to run debugging ***Rules*** on them. At a very broad level, a rule is a Python script used to detect certain conditions during training. Some of the conditions that a data scientist training an algorithm might be interested in are monitoring for gradients getting too large or too small, detecting overfitting, and so on. Amazon SageMaker Debugger comes pre-packaged with certain built-in rules. You can also write your own rules using the Amazon SageMaker Debugger APIs. You can also analyze raw tensor data outside of the Rules construct in a notebook, using Amazon SageMaker Debugger's full set of APIs."
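+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "As a sketch of what a custom rule can look like (illustrative only; the class name and threshold below are assumptions and are not used later in this notebook), a rule subclasses `Rule` from `smdebug` and implements `invoke_at_step`:\n",
+    "\n",
+    "```python\n",
+    "from smdebug.rules.rule import Rule\n",
+    "\n",
+    "class CustomGradientRule(Rule):\n",
+    "    def __init__(self, base_trial, threshold=10.0):\n",
+    "        super().__init__(base_trial)\n",
+    "        self.threshold = float(threshold)\n",
+    "\n",
+    "    def invoke_at_step(self, step):\n",
+    "        # Trigger if the mean absolute gradient exceeds the threshold.\n",
+    "        for tname in self.base_trial.tensor_names(collection='gradients'):\n",
+    "            abs_mean = self.base_trial.tensor(tname).reduction_value(step, 'mean', abs=True)\n",
+    "            if abs_mean > self.threshold:\n",
+    "                return True\n",
+    "        return False\n",
+    "```"
+   ]
+  },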
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "## Setup\n",
+    "\n",
+    "To successfully execute this example, the following packages need to be installed in your container:\n",
+    "\n",
+    "* PyTorch v1.3.1\n",
+    "* Torchvision v0.4.2\n",
+    "* Amazon SageMaker Debugger (smdebug)\n",
+    "\n",
+    "`!python -m pip install smdebug`"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "## Bring Your Own PyTorch for training\n",
+    "\n",
+    "In this notebook, we will train a PyTorch model with Amazon SageMaker Debugger enabled. We can do that by using a custom PyTorch container, enabling Amazon SageMaker Debugger in the training script, and bringing it to Amazon SageMaker for training.\n",
+    "\n",
+    "Note: The changes to the training script that are mentioned in this notebook are only required when a custom container is used. Amazon SageMaker Debugger will be automatically enabled (and will not require any changes to the training script) if you use the SageMaker Deep Learning Container for PyTorch.\n",
+    "\n",
+    "We will focus on how to modify a training script to save tensors by registering debug hooks and specifying which tensors to save.\n",
+    "\n",
+    "The model used for this notebook is trained on the MNIST dataset. The example is based on https://github.com/pytorch/examples/blob/master/mnist/main.py"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "### Modifying the training script\n",
+    "\n",
+    "Before we define the Estimator object and start training, we will explore parts of the training script in detail. (The entire training script can be found at [./scripts/pytorch_mnist.py](./scripts/pytorch_mnist.py).)"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "Step 1: Import Amazon SageMaker Debugger.\n",
+    "\n",
+    "```python\n",
+    "import smdebug.pytorch as smd\n",
+    "```\n",
+    "\n",
+    "Step 2: Create a debugger hook to save tensors of specified collections. Apart from a list of collections, the hook takes the save config and output directory as parameters. The output directory is a mandatory parameter. All these parameters can be specified in the config JSON file.\n",
+    "\n",
+    "```python\n",
+    "def create_smdebug_hook():\n",
+    "    # This allows you to create the hook from the configuration you pass to the SageMaker pySDK\n",
+    "    hook = smd.Hook.create_from_json_file()\n",
+    "    return hook\n",
+    "```\n",
+    "\n",
+    "Step 3: Register the hook for all layers in the model.\n",
+    "\n",
+    "```python\n",
+    "hook.register_hook(model)\n",
+    "```\n",
+    "\n",
+    "Step 4: For PyTorch, if you use a Loss module for the loss, add a step to register the loss.\n",
+    "\n",
+    "```python\n",
+    "hook.register_loss(criterion)\n",
+    "```\n",
+    "\n",
+    "Once these changes are made in the training script, Amazon SageMaker Debugger will start saving tensors belonging to the specified collections during training into the specified output directory.\n",
+    "\n",
+    "Now, we will set up the Estimator and start training using the modified training script."
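+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "Putting the four steps together, the hook is also switched between training and evaluation modes inside the loop, as the full script does; a minimal sketch, where `train_one_epoch` and `evaluate` stand in for your own loop bodies:\n",
+    "\n",
+    "```python\n",
+    "import smdebug.pytorch as smd\n",
+    "\n",
+    "hook = smd.Hook.create_from_json_file()\n",
+    "hook.register_hook(model)        # Step 3: capture tensors from all layers\n",
+    "hook.register_loss(criterion)    # Step 4: capture the loss values\n",
+    "\n",
+    "for epoch in range(epochs):\n",
+    "    hook.set_mode(smd.modes.TRAIN)   # tensors saved from here are tagged TRAIN\n",
+    "    train_one_epoch(model, optimizer, criterion)\n",
+    "    hook.set_mode(smd.modes.EVAL)    # tensors saved from here are tagged EVAL\n",
+    "    evaluate(model, criterion)\n",
+    "```"
+   ]
+  },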
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "from __future__ import absolute_import\n",
+    "\n",
+    "import boto3\n",
+    "from sagemaker.pytorch import PyTorch\n",
+    "from sagemaker import get_execution_role\n",
+    "from sagemaker.debugger import Rule, DebuggerHookConfig, TensorBoardOutputConfig, CollectionConfig, rule_configs"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "Define the configuration of the training run. `ecr_image` is where you can provide the link to your bring-your-own container. In `hyperparameters`, which are fed into the training script, the data directory (the directory where the training dataset is stored) and the smdebug directory (the directory where the tensors will be saved) are mandatory fields."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "role = get_execution_role()\n",
+    "training_dir = '/tmp/pytorch-smdebug'\n",
+    "smdebug_mnist_script = 'scripts/pytorch_mnist.py'\n",
+    "\n",
+    "hyperparameters = {'random_seed': True, 'num_steps': 50, 'epochs': 5,\n",
+    "                   'data_dir':training_dir}"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "`rules` is a new parameter that accepts a list of rules against which you wish to evaluate the tensors output. Amazon SageMaker Debugger supports two types of rules:\n",
+    "* SageMaker Rules: These are rules specially curated by the data science and engineering teams in Amazon SageMaker which you can opt to evaluate against your training job.\n",
+    "* Custom Rules: You can optionally choose to write your own rule as a Python source file and have it evaluated against your training job. For Amazon SageMaker Debugger to evaluate this rule, you would have to provide the S3 location of the rule source and the evaluator image."
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "In this example, we will use the VanishingGradient rule, which evaluates whether there are vanishing gradients. Alternatively, you could write your own custom rule, as demonstrated in [this](https://github.com/aws/amazon-sagemaker-examples-staging/blob/master/sagemaker-debugger/tensorflow_keras_custom_rule/tf-keras-custom-rule.ipynb) example."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "rules = [\n",
+    "    Rule.sagemaker(rule_configs.vanishing_gradient())\n",
+    "]\n",
+    "\n",
+    "estimator = PyTorch(entry_point=smdebug_mnist_script,\n",
+    "                    base_job_name='smdebugger-demo-mnist-pytorch',\n",
+    "                    role=role,\n",
+    "                    train_instance_count=1,\n",
+    "                    train_instance_type='ml.m4.xlarge',\n",
+    "                    train_volume_size=400,\n",
+    "                    train_max_run=3600,\n",
+    "                    hyperparameters=hyperparameters,\n",
+    "                    framework_version='1.3.1',\n",
+    "                    py_version='py3',\n",
+    "                    ## New parameter\n",
+    "                    rules = rules\n",
+    "                    )"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "*Note that Amazon SageMaker Debugger is only supported for py_version='py3'.*"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "With the next step, we kick off the training job using the Estimator object we created above. Note that the training job starts asynchronously here: the notebook is not blocked, and control flow passes to the next cell."
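+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "Because the call is asynchronous, you can poll the job yourself if you prefer to block until it finishes; a sketch (assumes the estimator has been fit in the cell below):\n",
+    "\n",
+    "```python\n",
+    "import time\n",
+    "\n",
+    "job_name = estimator.latest_training_job.name\n",
+    "client = estimator.sagemaker_session.sagemaker_client\n",
+    "while True:\n",
+    "    status = client.describe_training_job(TrainingJobName=job_name)['TrainingJobStatus']\n",
+    "    print(status)\n",
+    "    if status in ('Completed', 'Failed', 'Stopped'):\n",
+    "        break\n",
+    "    time.sleep(30)\n",
+    "```"
+   ]
+  },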
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "estimator.fit(wait=False)"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "### Result\n",
+    "\n",
+    "As a result of calling fit(), Amazon SageMaker Debugger kicked off a rule evaluation job for the VanishingGradient rule, in parallel with the training job. The rule evaluation status(es) will be visible in the training logs at regular intervals. As you can see in the summary below, no step in the training reported vanishing gradients in the tensors."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 23,
+   "metadata": {},
+   "outputs": [
+    {
+     "data": {
+      "text/plain": [
+       "[{'RuleConfigurationName': 'VanishingGradient',\n",
+       "  'RuleEvaluationJobArn': 'arn:aws:sagemaker:us-west-2:072677473360:processing-job/smdebugger-demo-mnist-pyto-vanishinggradient-52ca2f8e',\n",
+       "  'RuleEvaluationStatus': 'NoIssuesFound',\n",
+       "  'LastModifiedTime': datetime.datetime(2019, 12, 3, 0, 50, 53, 50000, tzinfo=tzlocal())}]"
+      ]
+     },
+     "execution_count": 23,
+     "metadata": {},
+     "output_type": "execute_result"
+    }
+   ],
+   "source": [
+    "estimator.latest_training_job.rule_job_summary()"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 24,
+   "metadata": {},
+   "outputs": [
+    {
+     "data": {
+      "text/plain": [
+       "{'VanishingGradient': 'https://us-west-2.console.aws.amazon.com/cloudwatch/home?region=us-west-2#logStream:group=/aws/sagemaker/ProcessingJobs;prefix=smdebugger-demo-mnist-pyto-VanishingGradient-52ca2f8e;streamFilter=typeLogStreamPrefix'}"
+      ]
+     },
+     "execution_count": 24,
+     "metadata": {},
+     "output_type": "execute_result"
+    }
+   ],
+   "source": [
+    "def _get_rule_job_name(training_job_name, rule_configuration_name, rule_job_arn):\n",
+    "    \"\"\"Helper function to get the rule job name with correct casing\"\"\"\n",
+    "    return \"{}-{}-{}\".format(\n",
+    "        training_job_name[:26], rule_configuration_name[:26], rule_job_arn[-8:]\n",
+    "    )\n",
+    "\n",
+    "def _get_cw_url_for_rule_job(rule_job_name, region):\n",
+    "    return \"https://{}.console.aws.amazon.com/cloudwatch/home?region={}#logStream:group=/aws/sagemaker/ProcessingJobs;prefix={};streamFilter=typeLogStreamPrefix\".format(region, region, rule_job_name)\n",
+    "\n",
+    "\n",
+    "def get_rule_jobs_cw_urls(estimator):\n",
+    "    region = boto3.Session().region_name\n",
+    "    training_job = estimator.latest_training_job\n",
+    "    training_job_name = training_job.describe()[\"TrainingJobName\"]\n",
+    "    rule_eval_statuses = training_job.describe()[\"DebugRuleEvaluationStatuses\"]\n",
+    "\n",
+    "    result = {}\n",
+    "    for status in rule_eval_statuses:\n",
+    "        if status.get(\"RuleEvaluationJobArn\", None) is not None:\n",
+    "            rule_job_name = _get_rule_job_name(training_job_name, status[\"RuleConfigurationName\"], status[\"RuleEvaluationJobArn\"])\n",
+    "            result[status[\"RuleConfigurationName\"]] = _get_cw_url_for_rule_job(rule_job_name, region)\n",
+    "    return result\n",
+    "\n",
+    "get_rule_jobs_cw_urls(estimator)"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "### Analysis\n",
+    "\n",
+    "Another aspect of Amazon SageMaker Debugger is analysis. It allows us to perform interactive exploration of the saved tensors, in real time or after the job. Here we focus on after-the-fact analysis of the above job. We import the smdebug library, which defines a concept of Trial that represents a single training run. 
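\n",
+    "\n",
+    "Once the trial is created (see the next cell), each saved tensor can be fetched step by step and plotted; a minimal sketch, assuming matplotlib is available in the kernel (the tensor name below is the loss tensor listed later in this notebook):\n",
+    "\n",
+    "```python\n",
+    "import matplotlib.pyplot as plt\n",
+    "\n",
+    "tname = 'CrossEntropyLoss_output_0'   # the loss tensor saved by the hook\n",
+    "steps = trial.tensor(tname).steps()\n",
+    "values = [trial.tensor(tname).value(s) for s in steps]\n",
+    "plt.plot(steps, values)\n",
+    "plt.xlabel('step')\n",
+    "plt.ylabel('loss')\n",
+    "plt.show()\n",
+    "```\n",
+    "\n",
+    "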
Note how we fetch the path to debugger artifacts for the above job." + ] + }, + { + "cell_type": "code", + "execution_count": 25, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "[2019-12-03 00:50:59.439 ip-172-16-56-202:4023 INFO s3_trial.py:42] Loading trial debug-output at path s3://sagemaker-us-west-2-072677473360/smdebugger-demo-mnist-pytorch-2019-12-03-00-44-45-065/debug-output\n" + ] + } + ], + "source": [ + "from smdebug.trials import create_trial\n", + "trial = create_trial(estimator.latest_job_debugger_artifacts_path())" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We can list all the tensors that were recorded to know what we want to plot." + ] + }, + { + "cell_type": "code", + "execution_count": 26, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "[2019-12-03 00:51:01.336 ip-172-16-56-202:4023 INFO trial.py:197] Training has ended, will refresh one final time in 1 sec.\n", + "[2019-12-03 00:51:02.375 ip-172-16-56-202:4023 INFO trial.py:209] Loaded all steps\n" + ] + }, + { + "data": { + "text/plain": [ + "['CrossEntropyLoss_input_0',\n", + " 'CrossEntropyLoss_input_1',\n", + " 'CrossEntropyLoss_output_0',\n", + " 'gradient/Net_conv1.bias',\n", + " 'gradient/Net_conv1.weight',\n", + " 'gradient/Net_conv2.bias',\n", + " 'gradient/Net_conv2.weight',\n", + " 'gradient/Net_fc1.bias',\n", + " 'gradient/Net_fc1.weight',\n", + " 'gradient/Net_fc2.bias',\n", + " 'gradient/Net_fc2.weight']" + ] + }, + "execution_count": 26, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "trial.tensor_names()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We can also retrieve tensors by some default collections that smdebug creates from your training job. Here we are interested in the losses collection, so we can retrieve the names of tensors in losses collection as follows. Amazon SageMaker Debugger creates default collections such as weights, gradients, biases, losses automatically. You can also create custom collections from your tensors." + ] + }, + { + "cell_type": "code", + "execution_count": 27, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "['CrossEntropyLoss_input_0',\n", + " 'CrossEntropyLoss_input_1',\n", + " 'CrossEntropyLoss_output_0']" + ] + }, + "execution_count": 27, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "trial.tensor_names(collection=\"losses\")" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [] + } + ], + "metadata": { + "kernelspec": { + "display_name": "conda_python3", + "language": "python", + "name": "conda_python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.6.5" + } + }, + "nbformat": 4, + "nbformat_minor": 4 +} diff --git a/sagemaker-debugger/pytorch_custom_container/scripts/pytorch_mnist.py b/sagemaker-debugger/pytorch_custom_container/scripts/pytorch_mnist.py new file mode 100644 index 0000000000..1ccbe6058a --- /dev/null +++ b/sagemaker-debugger/pytorch_custom_container/scripts/pytorch_mnist.py @@ -0,0 +1,195 @@ +""" +This script is a simple MNIST training script which uses PyTorch. 
+It has been orchestrated with Amazon SageMaker Debugger hooks to allow saving tensors during training. +These hooks have been instrumented to read from json configuration that SageMaker will put in the training container. +Configuration provided to the SageMaker python SDK when creating a job will be passed on to the hook. +This allows you to use the same script with differing configurations across different runs. +If you use an official SageMaker Framework container (i.e. AWS Deep Learning Container), then +you do not have to orchestrate your script as below. Hooks will automatically be added in those environments. +For more information, please refer to https://github.com/awslabs/sagemaker-debugger/blob/master/docs/sagemaker.md +""" + + +from __future__ import absolute_import +import argparse +import logging +import os +import sys + +import cv2 as cv +import sagemaker_containers +import torch +import torch.nn as nn +import torch.nn.functional as F +import torch.optim as optim +import torch.utils.data +from torchvision import datasets, transforms + +# SageMaker Debugger: Import the package +import smdebug.pytorch as smd + +import numpy as np +import random + +logger = logging.getLogger(__name__) +logger.setLevel(logging.DEBUG) +logger.addHandler(logging.StreamHandler(sys.stdout)) + + +# Based on https://github.com/pytorch/examples/blob/master/mnist/main.py +class Net(nn.Module): + def __init__(self): + logger.info("Create neural network module") + + super(Net, self).__init__() + self.conv1 = nn.Conv2d(1, 10, kernel_size=5) + self.conv2 = nn.Conv2d(10, 20, kernel_size=5) + self.conv2_drop = nn.Dropout2d() + self.fc1 = nn.Linear(320, 50) + self.fc2 = nn.Linear(50, 10) + + def forward(self, x): + x = F.relu(F.max_pool2d(self.conv1(x), 2)) + x = F.relu(F.max_pool2d(self.conv2_drop(self.conv2(x)), 2)) + x = x.view(-1, 320) + x = F.relu(self.fc1(x)) + x = F.dropout(x, training=self.training) + x = self.fc2(x) + return F.log_softmax(x, dim=1) + + +def parse_args(): + env = sagemaker_containers.training_env() + parser = argparse.ArgumentParser(description="PyTorch MNIST Example") + + parser.add_argument('--data_dir', type=str) + parser.add_argument("--batch-size", type=int, default=4, help="Batch size") + parser.add_argument("--epochs", type=int, default=1, help="Number of Epochs") + + # SageMaker Debugger: Mention the path where you would like the tensors to be + # saved. + parser.add_argument( + "--smdebug_path", + type=str, + default=None, + help="S3 URI of the bucket where tensor data will be stored.", + ) + parser.add_argument("--learning_rate", type=float, default=0.001) + parser.add_argument("--momentum", type=float, default=0.9) + parser.add_argument("--random_seed", type=bool, default=False) + parser.add_argument( + "--num_steps", + type=int, + default=50, + help="Reduce the number of training " + "and evaluation steps to the give number if desired." 
+ "If this is not passed, trains for one epoch " + "of training and validation data", + ) + parser.add_argument('--log_interval', type=int, default=100, metavar='N', + help='how many batches to wait before logging training status') + + opt = parser.parse_args() + return opt + + +def _get_train_data_loader(batch_size, training_dir): + logger.info("Get train data loader") + dataset = datasets.MNIST(training_dir, train=True, download=True, transform=transforms.Compose([ + transforms.ToTensor(), + transforms.Normalize((0.1307,), (0.3081,)) + ])) + return torch.utils.data.DataLoader(dataset, batch_size=batch_size, shuffle=True, + num_workers=4) + + +def _get_test_data_loader(test_batch_size, training_dir): + logger.info("Get test data loader") + return torch.utils.data.DataLoader( + datasets.MNIST(training_dir, train=False, download=True, transform=transforms.Compose([ + transforms.ToTensor(), + transforms.Normalize((0.1307,), (0.3081,)) + ])), + batch_size=test_batch_size, shuffle=False, num_workers=4) + + +# SageMaker Debugger: This function created the debug hook required to log tensors. +# In this example, weight, gradients and losses will be logged at steps 1,2, and 3, +# and saved to the output directory specified in hyperparameters. +def create_smdebug_hook(): + # This allows you to create the hook from the configuration you pass to the SageMaker pySDK + hook = smd.Hook.create_from_json_file() + return hook + + +def train(model, device, optimizer, hook, epochs, log_interval, training_dir): + criterion = nn.CrossEntropyLoss() + # SageMaker Debugger: If you are using a Loss module and would like to save the + # values as we are doing in this example, then add a call to register loss. + hook.register_loss(criterion) + + trainloader = _get_train_data_loader(4, training_dir) + validloader = _get_test_data_loader(4, training_dir) + + for epoch in range(epochs): + model.train() + hook.set_mode(smd.modes.TRAIN) + for i, data in enumerate(trainloader): + inputs, labels = data + optimizer.zero_grad() + output = model(inputs) + loss = criterion(output, labels) + loss.backward() + optimizer.step() + + if i % log_interval == 0: + logger.debug('Train Epoch: {} [{}/{} ({:.0f}%)]\tLoss: {:.6f}'.format( + epoch, i * len(data), len(trainloader.sampler), + 100. * i / len(trainloader), loss.item())) + + test(model, hook, validloader, device, criterion) + + +def test(model, hook, test_loader, device, loss_fn): + model.eval() + hook.set_mode(smd.modes.EVAL) + test_loss = 0 + correct = 0 + with torch.no_grad(): + for data, target in test_loader: + data, target = data.to(device), target.to(device) + output = model(data) + test_loss += loss_fn(output, target).item() # sum up batch loss + pred = output.max(1, keepdim=True)[1] # get the index of the max log-probability + correct += pred.eq(target.view_as(pred)).sum().item() + + test_loss /= len(test_loader.dataset) + logger.debug('Test set: Average loss: {:.4f}, Accuracy: {}/{} ({:.0f}%)\n'.format( + test_loss, correct, len(test_loader.dataset), + 100. * correct / len(test_loader.dataset))) + + +def main(): + opt = parse_args() + + if opt.random_seed: + torch.manual_seed(128) + random.seed(12) + np.random.seed(2) + + training_dir = opt.data_dir + + device = torch.device("cpu") + model = Net().to(device) + + # SageMaker Debugger: Create the debug hook, + # and register the hook to save tensors. 
+ hook = create_smdebug_hook() + hook.register_hook(model) + + optimizer = optim.SGD(model.parameters(), lr=opt.learning_rate, momentum=opt.momentum) + train(model, device, optimizer, hook, opt.epochs, opt.log_interval, training_dir) + print("Training is complete") + +if __name__ == "__main__": + main() diff --git a/sagemaker-debugger/tensorflow_action_on_rule/src/mnist_byoc.py b/sagemaker-debugger/tensorflow_action_on_rule/src/mnist_byoc.py new file mode 100644 index 0000000000..29ca87c453 --- /dev/null +++ b/sagemaker-debugger/tensorflow_action_on_rule/src/mnist_byoc.py @@ -0,0 +1,136 @@ +""" +This script is a simple MNIST training script which uses Tensorflow's Estimator interface. +It has been orchestrated with SageMaker Debugger hooks to allow saving tensors during training. +These hooks have been instrumented to read from json configuration that SageMaker will put in the training container. +Configuration provided to the SageMaker python SDK when creating a job will be passed on to the hook. +This allows you to use the same script with differing configurations across different runs. +If you use an official SageMaker Framework container (i.e. AWS Deep Learning Container), then +you do not have to orchestrate your script as below. Hooks will automatically be added in those environments. +For more information, please refer to https://github.com/awslabs/sagemaker-debugger/blob/master/docs/sagemaker.md +""" + +# Standard Library +import argparse +import random + +# Third Party +import numpy as np +import tensorflow as tf +import smdebug.tensorflow as smd + +import logging +logging.getLogger().setLevel(logging.INFO) + +parser = argparse.ArgumentParser() +parser.add_argument("--lr", type=float, default=0.001) +parser.add_argument("--random_seed", type=bool, default=False) +parser.add_argument("--num_epochs", type=int, default=5, help="Number of epochs to train for") +parser.add_argument( + "--num_steps", + type=int, + help="Number of steps to train for. If this" "is passed, it overrides num_epochs", +) +parser.add_argument( + "--num_eval_steps", + type=int, + help="Number of steps to evaluate for. 
If this" + "is passed, it doesnt evaluate over the full eval set", +) +parser.add_argument("--model_dir", type=str, default="/tmp/mnist_model") +args = parser.parse_args() + +if args.random_seed: + tf.set_random_seed(2) + np.random.seed(2) + random.seed(12) + +# This allows you to create the hook from the configuration you pass to the SageMaker pySDK +hook = smd.EstimatorHook.create_from_json_file() + +def cnn_model_fn(features, labels, mode): + """Model function for CNN.""" + # Input Layer + input_layer = tf.reshape(features["x"], [-1, 28, 28, 1]) + + # Convolutional Layer #1 + conv1 = tf.layers.conv2d( + inputs=input_layer, filters=32, kernel_size=[5, 5], padding="same", activation=tf.nn.relu + ) + + # Pooling Layer #1 + pool1 = tf.layers.max_pooling2d(inputs=conv1, pool_size=[2, 2], strides=2) + + # Convolutional Layer #2 and Pooling Layer #2 + conv2 = tf.layers.conv2d( + inputs=pool1, filters=64, kernel_size=[5, 5], padding="same", activation=tf.nn.relu + ) + pool2 = tf.layers.max_pooling2d(inputs=conv2, pool_size=[2, 2], strides=2) + + # Dense Layer + pool2_flat = tf.reshape(pool2, [-1, 7 * 7 * 64]) + dense = tf.layers.dense(inputs=pool2_flat, units=1024, activation=tf.nn.relu) + dropout = tf.layers.dropout( + inputs=dense, rate=0.4, training=mode == tf.estimator.ModeKeys.TRAIN + ) + + # Logits Layer + logits = tf.layers.dense(inputs=dropout, units=10) + + predictions = { + # Generate predictions (for PREDICT and EVAL mode) + "classes": tf.argmax(input=logits, axis=1), + # Add `softmax_tensor` to the graph. It is used for PREDICT and by the + # `logging_hook`. + "probabilities": tf.nn.softmax(logits, name="softmax_tensor"), + } + + if mode == tf.estimator.ModeKeys.PREDICT: + return tf.estimator.EstimatorSpec(mode=mode, predictions=predictions) + + # Calculate Loss (for both TRAIN and EVAL modes) + loss = tf.losses.sparse_softmax_cross_entropy(labels=labels, logits=logits) + + # Configure the Training Op (for TRAIN mode) + if mode == tf.estimator.ModeKeys.TRAIN: + optimizer = tf.train.GradientDescentOptimizer(learning_rate=args.lr) + + # SMD: Wrap your optimizer as follows to help SageMaker Debugger identify gradients + # This does not change your optimization logic, it returns back the same optimizer + optimizer = hook.wrap_optimizer(optimizer) + + train_op = optimizer.minimize(loss=loss, global_step=tf.train.get_global_step()) + return tf.estimator.EstimatorSpec(mode=mode, loss=loss, train_op=train_op) + + # Add evaluation metrics (for EVAL mode) + eval_metric_ops = { + "accuracy": tf.metrics.accuracy(labels=labels, predictions=predictions["classes"]) + } + return tf.estimator.EstimatorSpec(mode=mode, loss=loss, eval_metric_ops=eval_metric_ops) + + +# Load training and eval data +((train_data, train_labels), (eval_data, eval_labels)) = tf.keras.datasets.mnist.load_data() + +train_data = train_data / np.float32(255) +train_labels = train_labels.astype(np.int32) # not required + +eval_data = eval_data / np.float32(255) +eval_labels = eval_labels.astype(np.int32) # not required + +mnist_classifier = tf.estimator.Estimator(model_fn=cnn_model_fn, model_dir=args.model_dir) + +train_input_fn = tf.estimator.inputs.numpy_input_fn( + x={"x": train_data}, y=train_labels, batch_size=128, num_epochs=args.num_epochs, shuffle=True +) + +eval_input_fn = tf.estimator.inputs.numpy_input_fn( + x={"x": eval_data}, y=eval_labels, num_epochs=1, shuffle=False +) + +# Set training mode so SMDebug can classify the steps into training mode +hook.set_mode(smd.modes.TRAIN) 
+mnist_classifier.train(input_fn=train_input_fn, steps=args.num_steps, hooks=[hook]) + +# Set eval mode so SMDebug can classify the steps into eval mode +hook.set_mode(smd.modes.EVAL) +mnist_classifier.evaluate(input_fn=eval_input_fn, steps=args.num_eval_steps, hooks=[hook]) diff --git a/sagemaker-debugger/tensorflow_action_on_rule/src/mnist_zerocodechange.py b/sagemaker-debugger/tensorflow_action_on_rule/src/mnist_zerocodechange.py new file mode 100644 index 0000000000..5107e3e2a7 --- /dev/null +++ b/sagemaker-debugger/tensorflow_action_on_rule/src/mnist_zerocodechange.py @@ -0,0 +1,124 @@ +""" +This script is a simple MNIST training script which uses Tensorflow's Estimator interface. +It is designed to be used with SageMaker Debugger in an official SageMaker Framework container (i.e. AWS Deep Learning Container). You will notice that this script looks exactly like a normal TensorFlow training script. +The hook needed by SageMaker Debugger to save tensors during training will be automatically added in those environments. +The hook will load configuration from json configuration that SageMaker will put in the training container from the configuration provided using the SageMaker python SDK when creating a job. +For more information, please refer to https://github.com/awslabs/sagemaker-debugger/blob/master/docs/sagemaker.md +""" + +# Standard Library +import argparse +import random + +# Third Party +import numpy as np +import tensorflow as tf + +import logging +logging.getLogger().setLevel(logging.INFO) + +parser = argparse.ArgumentParser() +parser.add_argument("--lr", type=float, default=0.001) +parser.add_argument("--random_seed", type=bool, default=False) +parser.add_argument("--num_epochs", type=int, default=5, help="Number of epochs to train for") +parser.add_argument( + "--num_steps", + type=int, + help="Number of steps to train for. If this" "is passed, it overrides num_epochs", +) +parser.add_argument( + "--num_eval_steps", + type=int, + help="Number of steps to evaluate for. If this" + "is passed, it doesnt evaluate over the full eval set", +) +parser.add_argument("--model_dir", type=str, default="/tmp/mnist_model") +args = parser.parse_args() + +# these random seeds are only intended for test purpose. +# for now, 2,2,12 could promise no assert failure when running tests. 
+# if you wish to change the number, notice that certain steps' tensor value may be capable of variation +if args.random_seed: + tf.set_random_seed(2) + np.random.seed(2) + random.seed(12) + + +def cnn_model_fn(features, labels, mode): + """Model function for CNN.""" + # Input Layer + input_layer = tf.reshape(features["x"], [-1, 28, 28, 1]) + + # Convolutional Layer #1 + conv1 = tf.layers.conv2d( + inputs=input_layer, filters=32, kernel_size=[5, 5], padding="same", activation=tf.nn.relu + ) + + # Pooling Layer #1 + pool1 = tf.layers.max_pooling2d(inputs=conv1, pool_size=[2, 2], strides=2) + + # Convolutional Layer #2 and Pooling Layer #2 + conv2 = tf.layers.conv2d( + inputs=pool1, filters=64, kernel_size=[5, 5], padding="same", activation=tf.nn.relu + ) + pool2 = tf.layers.max_pooling2d(inputs=conv2, pool_size=[2, 2], strides=2) + + # Dense Layer + pool2_flat = tf.reshape(pool2, [-1, 7 * 7 * 64]) + dense = tf.layers.dense(inputs=pool2_flat, units=1024, activation=tf.nn.relu) + dropout = tf.layers.dropout( + inputs=dense, rate=0.4, training=mode == tf.estimator.ModeKeys.TRAIN + ) + + # Logits Layer + logits = tf.layers.dense(inputs=dropout, units=10) + + predictions = { + # Generate predictions (for PREDICT and EVAL mode) + "classes": tf.argmax(input=logits, axis=1), + # Add `softmax_tensor` to the graph. It is used for PREDICT and by the + # `logging_hook`. + "probabilities": tf.nn.softmax(logits, name="softmax_tensor"), + } + + if mode == tf.estimator.ModeKeys.PREDICT: + return tf.estimator.EstimatorSpec(mode=mode, predictions=predictions) + + # Calculate Loss (for both TRAIN and EVAL modes) + loss = tf.losses.sparse_softmax_cross_entropy(labels=labels, logits=logits) + + # Configure the Training Op (for TRAIN mode) + if mode == tf.estimator.ModeKeys.TRAIN: + optimizer = tf.train.GradientDescentOptimizer(learning_rate=args.lr) + train_op = optimizer.minimize(loss=loss, global_step=tf.train.get_global_step()) + return tf.estimator.EstimatorSpec(mode=mode, loss=loss, train_op=train_op) + + # Add evaluation metrics (for EVAL mode) + eval_metric_ops = { + "accuracy": tf.metrics.accuracy(labels=labels, predictions=predictions["classes"]) + } + return tf.estimator.EstimatorSpec(mode=mode, loss=loss, eval_metric_ops=eval_metric_ops) + + +# Load training and eval data +((train_data, train_labels), (eval_data, eval_labels)) = tf.keras.datasets.mnist.load_data() + +train_data = train_data / np.float32(255) +train_labels = train_labels.astype(np.int32) # not required + +eval_data = eval_data / np.float32(255) +eval_labels = eval_labels.astype(np.int32) # not required + +mnist_classifier = tf.estimator.Estimator(model_fn=cnn_model_fn, model_dir=args.model_dir) + +train_input_fn = tf.estimator.inputs.numpy_input_fn( + x={"x": train_data}, y=train_labels, batch_size=128, num_epochs=args.num_epochs, shuffle=True +) + +eval_input_fn = tf.estimator.inputs.numpy_input_fn( + x={"x": eval_data}, y=eval_labels, num_epochs=1, shuffle=False +) + +mnist_classifier.train(input_fn=train_input_fn, steps=args.num_steps) + +mnist_classifier.evaluate(input_fn=eval_input_fn, steps=args.num_eval_steps) diff --git a/sagemaker-debugger/tensorflow_action_on_rule/tf-mnist-stop-training-job.ipynb b/sagemaker-debugger/tensorflow_action_on_rule/tf-mnist-stop-training-job.ipynb new file mode 100644 index 0000000000..60284c9d9c --- /dev/null +++ b/sagemaker-debugger/tensorflow_action_on_rule/tf-mnist-stop-training-job.ipynb @@ -0,0 +1,327 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# 
Amazon SageMaker Debugger - Reacting to CloudWatch Events from Rules\n",
+    "[Amazon SageMaker](https://aws.amazon.com/sagemaker/) is a managed platform to build, train and host machine learning models. Amazon SageMaker Debugger is a new feature which offers the capability to debug machine learning models during training by identifying and detecting problems with the models in near real time. \n",
+    "\n",
+    "In this notebook, we'll show you how you can react to rule triggers and take an action, e.g. stopping the training job, through CloudWatch Events.\n",
+    "\n",
+    "## How does Amazon SageMaker Debugger work?\n",
+    "\n",
+    "Amazon SageMaker Debugger lets you go beyond just looking at scalars like losses and accuracies during training and gives you full visibility into all tensors 'flowing through the graph' during training. Furthermore, it helps you monitor your training in near real time using rules, and provides alerts once it detects an inconsistency in the training flow.\n",
+    "\n",
+    "### Concepts\n",
+    "* **Tensors**: These represent the state of the training network at intermediate points during its execution\n",
+    "* **Debug Hook**: The hook is the construct with which Amazon SageMaker Debugger looks into the training process and captures the tensors requested at the desired step intervals\n",
+    "* **Rule**: A logical construct, implemented as Python code, which helps analyze the tensors captured by the hook and report anomalies, if any\n",
+    "\n",
+    "With these concepts in mind, let's understand the overall flow of things that Amazon SageMaker Debugger uses to orchestrate debugging.\n",
+    "\n",
+    "### Saving tensors during training\n",
+    "\n",
+    "The tensors captured by the debug hook are stored in the S3 location specified by you. There are two ways you can configure Amazon SageMaker Debugger to save tensors:\n",
+    "\n",
+    "#### With no changes to your training script\n",
+    "If you use one of the SageMaker provided [Deep Learning Containers](https://docs.aws.amazon.com/sagemaker/latest/dg/pre-built-containers-frameworks-deep-learning.html) for TensorFlow 1.15, then you don't need to make any changes to your training script for the tensors to be stored. SageMaker Debugger will use the configuration you provide through the SageMaker SDK's TensorFlow `Estimator` when creating your job to save the tensors in the fashion you specify. You can review the script we are going to use at [src/mnist_zerocodechange.py](src/mnist_zerocodechange.py). You will note that this is an untouched TensorFlow script which uses the Estimator interface. Please note that SageMaker Debugger only supports the `tf.keras`, `tf.Estimator` and `tf.MonitoredSession` interfaces. A full description of support is available at [SageMaker Debugger with TensorFlow](https://github.com/awslabs/sagemaker-debugger/tree/master/docs/tensorflow.md)\n",
+    "\n",
+    "#### Orchestrating your script to store tensors\n",
+    "For other containers, you need to make a couple of lines of changes to your training script. SageMaker Debugger exposes a library, `smdebug`, which allows you to capture these tensors and save them for analysis. It's highly customizable and allows you to save the specific tensors you want at different frequencies and possibly with other configurations. Refer to the [Developer Guide](https://github.com/awslabs/sagemaker-debugger/tree/master/docs) for details on how to use the SageMaker Debugger library with your choice of framework in your training script. 
Here we have an example script orchestrated at [src/mnist_byoc.py](src/mnist_byoc.py). You also need to ensure that your container has the `smdebug` library installed.\n",
+    "\n",
+    "### Analysis of tensors\n",
+    "\n",
+    "Once the tensors are saved, Amazon SageMaker Debugger can be configured to run debugging ***Rules*** on them. At a very broad level, a rule is Python code used to detect certain conditions during training. Some of the conditions that a data scientist training an algorithm may care about are monitoring for gradients getting too large or too small, detecting overfitting, and so on. Amazon SageMaker Debugger comes pre-packaged with certain built-in rules. Users can write their own rules using the Amazon SageMaker Debugger APIs. You can also analyze raw tensor data outside of the Rules construct in, say, a SageMaker notebook, using Amazon SageMaker Debugger's full set of APIs. Please refer to the [Analysis Developer Guide](https://github.com/awslabs/sagemaker-debugger/blob/master/docs/api.md) for more on these APIs.\n"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "### CloudWatch Events for Rules\n",
+    "Rule status changes in a training job trigger CloudWatch Events. These events can be acted upon by configuring a CloudWatch Rule (different from an Amazon SageMaker Debugger Rule) to trigger each time a Debugger Rule changes status. In this notebook we'll go through how you can create a CloudWatch Rule to direct Training Job State change events to a Lambda function that stops the training job in case a rule triggers and has status `\"IssuesFound\"`.\n",
+    "\n",
+    "#### Lambda Function\n",
+    "\n",
+    "* In your AWS console, go to the Lambda Management Console,\n",
+    "* Create a new function by hitting Create Function,\n",
+    "* Choose Python 3.7 as the language and put in the following sample code for stopping the training job if one of the Rule statuses is `\"IssuesFound\"`:\n",
+    "\n",
+    "```python\n",
+    "import json\n",
+    "import boto3\n",
+    "import logging\n",
+    "\n",
+    "def lambda_handler(event, context):\n",
+    "    training_job_name = event.get(\"detail\").get(\"TrainingJobName\")\n",
+    "    eval_statuses = event.get(\"detail\").get(\"DebugRuleEvaluationStatuses\", None)\n",
+    "\n",
+    "    if eval_statuses is None or len(eval_statuses) == 0:\n",
+    "        logging.info(\"Couldn't find any debug rule statuses, skipping...\")\n",
+    "        return {\n",
+    "            'statusCode': 200,\n",
+    "            'body': json.dumps('Nothing to do')\n",
+    "        }\n",
+    "\n",
+    "    client = boto3.client('sagemaker')\n",
+    "\n",
+    "    for status in eval_statuses:\n",
+    "        if status.get(\"RuleEvaluationStatus\") == \"IssuesFound\":\n",
+    "            logging.info(\n",
+    "                'Evaluation of rule configuration {} resulted in \"IssuesFound\". '\n",
+    "                'Attempting to stop training job {}'.format(\n",
+    "                    status.get(\"RuleConfigurationName\"), training_job_name\n",
+    "                )\n",
+    "            )\n",
+    "            try:\n",
+    "                client.stop_training_job(\n",
+    "                    TrainingJobName=training_job_name\n",
+    "                )\n",
+    "            except Exception as e:\n",
+    "                logging.error(\n",
+    "                    \"Encountered error while trying to \"\n",
+    "                    \"stop training job {}: {}\".format(\n",
+    "                        training_job_name, str(e)\n",
+    "                    )\n",
+    "                )\n",
+    "                raise e\n",
+    "    return None\n",
+    "```\n",
+    "* Create a new execution role for the Lambda, and\n",
+    "* In your IAM console, search for the role and attach the \"AmazonSageMakerFullAccess\" policy to the role. This is needed for the code in your Lambda function to stop the training job.\n",
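+    "\n",
+    "If you prefer to script this setup, the CloudWatch rule and Lambda target described in the next subsection can also be created with boto3; a sketch (here `LAMBDA_ARN` is a placeholder for the ARN of the function you created above):\n",
+    "\n",
+    "```python\n",
+    "import boto3\n",
+    "import json\n",
+    "\n",
+    "events = boto3.client('events')\n",
+    "# Match the same events the console rule would: training job state changes.\n",
+    "events.put_rule(\n",
+    "    Name='sagemaker-training-job-state-change',\n",
+    "    EventPattern=json.dumps({\n",
+    "        'source': ['aws.sagemaker'],\n",
+    "        'detail-type': ['SageMaker Training Job State Change'],\n",
+    "    }),\n",
+    "    State='ENABLED',\n",
+    ")\n",
+    "# Point the rule at the Lambda function created above.\n",
+    "events.put_targets(\n",
+    "    Rule='sagemaker-training-job-state-change',\n",
+    "    Targets=[{'Id': 'stop-training-lambda', 'Arn': LAMBDA_ARN}],\n",
+    ")\n",
+    "```\n",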
+    "\n",
+    "#### Create a CloudWatch Rule\n",
+    "\n",
+    "* In your AWS Console, go to CloudWatch and select Rules from the left column,\n",
+    "* Hit Create Rule. The console will redirect you to the Rule creation page,\n",
+    "  * For the Service Name, select \"SageMaker\".\n",
+    "  * For the Event Type, select \"SageMaker Training Job State Change\".\n",
+    "* In the Targets, select the Lambda function you created above, and\n",
+    "* For this example notebook, we'll leave everything else as is."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 4,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "import boto3\n",
+    "import os\n",
+    "import sagemaker\n",
+    "from sagemaker.tensorflow import TensorFlow"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 5,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "from sagemaker.debugger import Rule, rule_configs"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 151,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "# Define the entrypoint script\n",
+    "entrypoint_script='src/mnist_zerocodechange.py'\n",
+    "\n",
+    "# These hyperparameters ensure that vanishing gradient will trigger for our TensorFlow MNIST script\n",
+    "hyperparameters = {\n",
+    "    \"num_epochs\": \"10\",\n",
+    "    \"lr\": \"10.00\"\n",
+    "}"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 154,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "rules=[\n",
+    "    Rule.sagemaker(rule_configs.vanishing_gradient()), \n",
+    "    Rule.sagemaker(rule_configs.loss_not_decreasing())\n",
+    "]\n",
+    "\n",
+    "estimator = TensorFlow(\n",
+    "    role=sagemaker.get_execution_role(),\n",
+    "    base_job_name='smdebugger-demo-mnist-tensorflow',\n",
+    "    train_instance_count=1,\n",
+    "    train_instance_type='ml.m4.xlarge',\n",
+    "    entry_point=entrypoint_script,\n",
+    "    framework_version='1.15',\n",
+    "    train_volume_size=400,\n",
+    "    py_version='py3',\n",
+    "    train_max_run=3600,\n",
+    "    script_mode=True,\n",
+    "    hyperparameters=hyperparameters,\n",
+    "    ## New parameter\n",
+    "    rules = rules\n",
+    ")"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 155,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "# After calling fit, SageMaker will spin off 1 training job and 1 rule job per configured rule for you\n",
+    "# The rule evaluation status(es) will be visible in the training logs\n",
+    "# at regular intervals\n",
+    "# wait=False makes this a fire-and-forget call. To stream the logs in the notebook, leave it out\n",
+    "\n",
+    "estimator.fit(wait=False)"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "## Monitoring\n",
+    "\n",
+    "SageMaker kicked off rule evaluation jobs, one for each of the SageMaker rules - `VanishingGradient` and `LossNotDecreasing` - as specified in the estimator. 
\n", + "Given that we've tweaked the hyperparameters of our training script such that `VanishingGradient` is bound to fire, we should expect to see the `TrainingJobStatus` as\n", + "`Stopped` once the `RuleEvaluationStatus` for `VanishingGradient` changes to `IssuesFound`" + ] + }, + { + "cell_type": "code", + "execution_count": 191, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "[{'RuleConfigurationName': 'VanishingGradient',\n", + " 'RuleEvaluationJobArn': 'arn:aws:sagemaker:us-east-2:072677473360:processing-job/smdebugger-demo-mnist-tens-vanishinggradient-e23301a8',\n", + " 'RuleEvaluationStatus': 'IssuesFound',\n", + " 'StatusDetails': 'RuleEvaluationConditionMet: Evaluation of the rule VanishingGradient at step 500 resulted in the condition being met\\n',\n", + " 'LastModifiedTime': datetime.datetime(2019, 12, 1, 7, 20, 55, 495000, tzinfo=tzlocal())},\n", + " {'RuleConfigurationName': 'LossNotDecreasing',\n", + " 'RuleEvaluationJobArn': 'arn:aws:sagemaker:us-east-2:072677473360:processing-job/smdebugger-demo-mnist-tens-lossnotdecreasing-27ee2da1',\n", + " 'RuleEvaluationStatus': 'InProgress',\n", + " 'LastModifiedTime': datetime.datetime(2019, 12, 1, 7, 20, 55, 495000, tzinfo=tzlocal())}]" + ] + }, + "execution_count": 191, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "# rule job summary gives you the summary of the rule evaluations. You might have to run it over \n", + "# a few times before you start to see all values populated/changing\n", + "estimator.latest_training_job.rule_job_summary()" + ] + }, + { + "cell_type": "code", + "execution_count": 194, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "{'VanishingGradient': 'https://us-east-2.console.aws.amazon.com/cloudwatch/home?region=us-east-2#logStream:group=/aws/sagemaker/ProcessingJobs;prefix=smdebugger-demo-mnist-tens-VanishingGradient-e23301a8;streamFilter=typeLogStreamPrefix',\n", + " 'LossNotDecreasing': 'https://us-east-2.console.aws.amazon.com/cloudwatch/home?region=us-east-2#logStream:group=/aws/sagemaker/ProcessingJobs;prefix=smdebugger-demo-mnist-tens-LossNotDecreasing-27ee2da1;streamFilter=typeLogStreamPrefix'}" + ] + }, + "execution_count": 194, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "# This utility gives the link to monitor the CW event\n", + "def _get_rule_job_name(training_job_name, rule_configuration_name, rule_job_arn):\n", + " \"\"\"Helper function to get the rule job name\"\"\"\n", + " return \"{}-{}-{}\".format(\n", + " training_job_name[:26], rule_configuration_name[:26], rule_job_arn[-8:]\n", + " )\n", + " \n", + "def _get_cw_url_for_rule_job(rule_job_name, region):\n", + " return \"https://{}.console.aws.amazon.com/cloudwatch/home?region={}#logStream:group=/aws/sagemaker/ProcessingJobs;prefix={};streamFilter=typeLogStreamPrefix\".format(region, region, rule_job_name)\n", + "\n", + "\n", + "def get_rule_jobs_cw_urls(estimator):\n", + " region = boto3.Session().region_name\n", + " training_job = estimator.latest_training_job\n", + " training_job_name = training_job.describe()[\"TrainingJobName\"]\n", + " rule_eval_statuses = training_job.describe()[\"DebugRuleEvaluationStatuses\"]\n", + " \n", + " result={}\n", + " for status in rule_eval_statuses:\n", + " if status.get(\"RuleEvaluationJobArn\", None) is not None:\n", + " rule_job_name = _get_rule_job_name(training_job_name, status[\"RuleConfigurationName\"], status[\"RuleEvaluationJobArn\"])\n", + " result[status[\"RuleConfigurationName\"]] = 
_get_cw_url_for_rule_job(rule_job_name, region)\n", + " return result\n", + "\n", + "get_rule_jobs_cw_urls(estimator)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "After re-running the last two cells until `VanishingGradient` reports `IssuesFound`, we can describe the `TrainingJobStatus` of our training job." + ] + }, + { + "cell_type": "code", + "execution_count": 193, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "'Stopped'" + ] + }, + "execution_count": 193, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "estimator.latest_training_job.describe()[\"TrainingJobStatus\"]" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Result\n", + "\n", + "This notebook showed a simple setup that uses CloudWatch Events from your training job to take an action on rule evaluation status changes. Learn more about Amazon SageMaker Debugger in the [GitHub Documentation](https://github.com/awslabs/sagemaker-debugger)." + ] + } + ], + "metadata": { + "kernelspec": { + "display_name": "conda_tensorflow_p36", + "language": "python", + "name": "conda_tensorflow_p36" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.6.5" + } + }, + "nbformat": 4, + "nbformat_minor": 4 +} diff --git a/sagemaker-debugger/tensorflow_builtin_rule/src/mnist_byoc.py b/sagemaker-debugger/tensorflow_builtin_rule/src/mnist_byoc.py new file mode 100644 index 0000000000..4454436d86 --- /dev/null +++ b/sagemaker-debugger/tensorflow_builtin_rule/src/mnist_byoc.py @@ -0,0 +1,136 @@ +""" +This script is a simple MNIST training script which uses TensorFlow's Estimator interface. +It has been orchestrated with SageMaker Debugger hooks to allow saving tensors during training. +These hooks have been instrumented to read from the JSON configuration that SageMaker will put in the training container. +Configuration provided to the SageMaker Python SDK when creating a job will be passed on to the hook. +This allows you to use the same script with differing configurations across different runs. +If you use an official SageMaker Framework container (i.e. AWS Deep Learning Container), then +you do not have to orchestrate your script as below. Hooks will automatically be added in those environments. +For more information, please refer to https://github.com/awslabs/sagemaker-debugger/blob/master/docs/sagemaker.md +""" + +# Standard Library +import argparse +import random + +# Third Party +import numpy as np +import tensorflow as tf +import smdebug.tensorflow as smd + +import logging +logging.getLogger().setLevel(logging.INFO) + +parser = argparse.ArgumentParser() +parser.add_argument("--lr", type=float, default=0.001) +parser.add_argument("--random_seed", type=bool, default=False) +parser.add_argument("--num_epochs", type=int, default=5, help="Number of epochs to train for") +parser.add_argument( + "--num_steps", + type=int, + help="Number of steps to train for. If this " "is passed, it overrides num_epochs", +) +parser.add_argument( + "--num_eval_steps", + type=int, + help="Number of steps to evaluate for. 
If this" + " is passed, it doesn't evaluate over the full eval set", +) +parser.add_argument("--model_dir", type=str, default="/tmp/mnist_model") +args = parser.parse_args() + +if args.random_seed: + tf.set_random_seed(2) + np.random.seed(2) + random.seed(12) + +# This creates the hook from the JSON configuration that you pass through the SageMaker Python SDK +hook = smd.SessionHook.create_from_json_file() + +def cnn_model_fn(features, labels, mode): + """Model function for CNN.""" + # Input Layer + input_layer = tf.reshape(features["x"], [-1, 28, 28, 1]) + + # Convolutional Layer #1 + conv1 = tf.layers.conv2d( + inputs=input_layer, filters=32, kernel_size=[5, 5], padding="same", activation=tf.nn.relu + ) + + # Pooling Layer #1 + pool1 = tf.layers.max_pooling2d(inputs=conv1, pool_size=[2, 2], strides=2) + + # Convolutional Layer #2 and Pooling Layer #2 + conv2 = tf.layers.conv2d( + inputs=pool1, filters=64, kernel_size=[5, 5], padding="same", activation=tf.nn.relu + ) + pool2 = tf.layers.max_pooling2d(inputs=conv2, pool_size=[2, 2], strides=2) + + # Dense Layer + pool2_flat = tf.reshape(pool2, [-1, 7 * 7 * 64]) + dense = tf.layers.dense(inputs=pool2_flat, units=1024, activation=tf.nn.relu) + dropout = tf.layers.dropout( + inputs=dense, rate=0.4, training=mode == tf.estimator.ModeKeys.TRAIN + ) + + # Logits Layer + logits = tf.layers.dense(inputs=dropout, units=10) + + predictions = { + # Generate predictions (for PREDICT and EVAL mode) + "classes": tf.argmax(input=logits, axis=1), + # Add `softmax_tensor` to the graph. It is used for PREDICT and by the + # `logging_hook`. + "probabilities": tf.nn.softmax(logits, name="softmax_tensor"), + } + + if mode == tf.estimator.ModeKeys.PREDICT: + return tf.estimator.EstimatorSpec(mode=mode, predictions=predictions) + + # Calculate Loss (for both TRAIN and EVAL modes) + loss = tf.losses.sparse_softmax_cross_entropy(labels=labels, logits=logits) + + # Configure the Training Op (for TRAIN mode) + if mode == tf.estimator.ModeKeys.TRAIN: + optimizer = tf.train.GradientDescentOptimizer(learning_rate=args.lr) + + # SMD: Wrap your optimizer as follows to help SageMaker Debugger identify gradients + # This does not change your optimization logic, it returns the same optimizer + optimizer = hook.wrap_optimizer(optimizer) + + train_op = optimizer.minimize(loss=loss, global_step=tf.train.get_global_step()) + return tf.estimator.EstimatorSpec(mode=mode, loss=loss, train_op=train_op) + + # Add evaluation metrics (for EVAL mode) + eval_metric_ops = { + "accuracy": tf.metrics.accuracy(labels=labels, predictions=predictions["classes"]) + } + return tf.estimator.EstimatorSpec(mode=mode, loss=loss, eval_metric_ops=eval_metric_ops) + + +# Load training and eval data +((train_data, train_labels), (eval_data, eval_labels)) = tf.keras.datasets.mnist.load_data() + +train_data = train_data / np.float32(255) +train_labels = train_labels.astype(np.int32) # not required + +eval_data = eval_data / np.float32(255) +eval_labels = eval_labels.astype(np.int32) # not required + +mnist_classifier = tf.estimator.Estimator(model_fn=cnn_model_fn, model_dir=args.model_dir) + +train_input_fn = tf.estimator.inputs.numpy_input_fn( + x={"x": train_data}, y=train_labels, batch_size=128, num_epochs=args.num_epochs, shuffle=True +) + +eval_input_fn = tf.estimator.inputs.numpy_input_fn( + x={"x": eval_data}, y=eval_labels, num_epochs=1, shuffle=False +) + +# Set training mode so SMDebug can classify the steps into training mode +hook.set_mode(smd.modes.TRAIN)
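+# Passing the hook through hooks=[...] below registers it as a session hook on the Estimator calls; +# this is what actually saves the configured tensors while the job runs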
+mnist_classifier.train(input_fn=train_input_fn, steps=args.num_steps, hooks=[hook]) + +# Set eval mode so SMDebug can classify the steps into eval mode +hook.set_mode(smd.modes.EVAL) +mnist_classifier.evaluate(input_fn=eval_input_fn, steps=args.num_eval_steps, hooks=[hook]) diff --git a/sagemaker-debugger/tensorflow_builtin_rule/src/mnist_zerocodechange.py b/sagemaker-debugger/tensorflow_builtin_rule/src/mnist_zerocodechange.py new file mode 100644 index 0000000000..5107e3e2a7 --- /dev/null +++ b/sagemaker-debugger/tensorflow_builtin_rule/src/mnist_zerocodechange.py @@ -0,0 +1,124 @@ +""" +This script is a simple MNIST training script which uses TensorFlow's Estimator interface. +It is designed to be used with SageMaker Debugger in an official SageMaker Framework container (i.e. AWS Deep Learning Container). You will notice that this script looks exactly like a normal TensorFlow training script. +The hook needed by SageMaker Debugger to save tensors during training will be automatically added in those environments. +The hook loads its configuration from the JSON file that SageMaker puts in the training container, based on the configuration you provide through the SageMaker Python SDK when creating the job. +For more information, please refer to https://github.com/awslabs/sagemaker-debugger/blob/master/docs/sagemaker.md +""" + +# Standard Library +import argparse +import random + +# Third Party +import numpy as np +import tensorflow as tf + +import logging +logging.getLogger().setLevel(logging.INFO) + +parser = argparse.ArgumentParser() +parser.add_argument("--lr", type=float, default=0.001) +parser.add_argument("--random_seed", type=bool, default=False) +parser.add_argument("--num_epochs", type=int, default=5, help="Number of epochs to train for") +parser.add_argument( + "--num_steps", + type=int, + help="Number of steps to train for. If this " "is passed, it overrides num_epochs", +) +parser.add_argument( + "--num_eval_steps", + type=int, + help="Number of steps to evaluate for. If this" + " is passed, it doesn't evaluate over the full eval set", +) +parser.add_argument("--model_dir", type=str, default="/tmp/mnist_model") +args = parser.parse_args() + +# these random seeds are only intended for testing purposes. +# for now, the seeds 2, 2, 12 avoid assertion failures when running tests.
+# if you change these numbers, note that the tensor values at certain steps may vary +if args.random_seed: + tf.set_random_seed(2) + np.random.seed(2) + random.seed(12) + + +def cnn_model_fn(features, labels, mode): + """Model function for CNN.""" + # Input Layer + input_layer = tf.reshape(features["x"], [-1, 28, 28, 1]) + + # Convolutional Layer #1 + conv1 = tf.layers.conv2d( + inputs=input_layer, filters=32, kernel_size=[5, 5], padding="same", activation=tf.nn.relu + ) + + # Pooling Layer #1 + pool1 = tf.layers.max_pooling2d(inputs=conv1, pool_size=[2, 2], strides=2) + + # Convolutional Layer #2 and Pooling Layer #2 + conv2 = tf.layers.conv2d( + inputs=pool1, filters=64, kernel_size=[5, 5], padding="same", activation=tf.nn.relu + ) + pool2 = tf.layers.max_pooling2d(inputs=conv2, pool_size=[2, 2], strides=2) + + # Dense Layer + pool2_flat = tf.reshape(pool2, [-1, 7 * 7 * 64]) + dense = tf.layers.dense(inputs=pool2_flat, units=1024, activation=tf.nn.relu) + dropout = tf.layers.dropout( + inputs=dense, rate=0.4, training=mode == tf.estimator.ModeKeys.TRAIN + ) + + # Logits Layer + logits = tf.layers.dense(inputs=dropout, units=10) + + predictions = { + # Generate predictions (for PREDICT and EVAL mode) + "classes": tf.argmax(input=logits, axis=1), + # Add `softmax_tensor` to the graph. It is used for PREDICT and by the + # `logging_hook`. + "probabilities": tf.nn.softmax(logits, name="softmax_tensor"), + } + + if mode == tf.estimator.ModeKeys.PREDICT: + return tf.estimator.EstimatorSpec(mode=mode, predictions=predictions) + + # Calculate Loss (for both TRAIN and EVAL modes) + loss = tf.losses.sparse_softmax_cross_entropy(labels=labels, logits=logits) + + # Configure the Training Op (for TRAIN mode) + if mode == tf.estimator.ModeKeys.TRAIN: + optimizer = tf.train.GradientDescentOptimizer(learning_rate=args.lr) + train_op = optimizer.minimize(loss=loss, global_step=tf.train.get_global_step()) + return tf.estimator.EstimatorSpec(mode=mode, loss=loss, train_op=train_op) + + # Add evaluation metrics (for EVAL mode) + eval_metric_ops = { + "accuracy": tf.metrics.accuracy(labels=labels, predictions=predictions["classes"]) + } + return tf.estimator.EstimatorSpec(mode=mode, loss=loss, eval_metric_ops=eval_metric_ops) + + +# Load training and eval data +((train_data, train_labels), (eval_data, eval_labels)) = tf.keras.datasets.mnist.load_data() + +train_data = train_data / np.float32(255) +train_labels = train_labels.astype(np.int32) # not required + +eval_data = eval_data / np.float32(255) +eval_labels = eval_labels.astype(np.int32) # not required + +mnist_classifier = tf.estimator.Estimator(model_fn=cnn_model_fn, model_dir=args.model_dir) + +train_input_fn = tf.estimator.inputs.numpy_input_fn( + x={"x": train_data}, y=train_labels, batch_size=128, num_epochs=args.num_epochs, shuffle=True +) + +eval_input_fn = tf.estimator.inputs.numpy_input_fn( + x={"x": eval_data}, y=eval_labels, num_epochs=1, shuffle=False +) + +mnist_classifier.train(input_fn=train_input_fn, steps=args.num_steps) + +mnist_classifier.evaluate(input_fn=eval_input_fn, steps=args.num_eval_steps) diff --git a/sagemaker-debugger/tensorflow_builtin_rule/tf-mnist-builtin-rule.ipynb b/sagemaker-debugger/tensorflow_builtin_rule/tf-mnist-builtin-rule.ipynb new file mode 100644 index 0000000000..9a5a2ca221 --- /dev/null +++ b/sagemaker-debugger/tensorflow_builtin_rule/tf-mnist-builtin-rule.ipynb @@ -0,0 +1,435 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Amazon SageMaker Debugger
- Using built-in rule\n", + "[Amazon SageMaker](https://aws.amazon.com/sagemaker/) is a managed platform to build, train and host machine learning models. Amazon SageMaker Debugger is a new feature which offers the capability to debug machine learning models during training by detecting problems with the models in near real-time. \n", + "\n", + "In this notebook you'll look at how to use a SageMaker-provided built-in rule during a TensorFlow training job.\n", + "\n", + "## How does Amazon SageMaker Debugger work?\n", + "\n", + "Amazon SageMaker Debugger lets you go beyond just looking at scalars like losses and accuracies during training and gives you full visibility into all tensors 'flowing through the graph' during training. Furthermore, it helps you monitor your training in near real-time using rules and alerts you once it has detected an inconsistency in the training flow.\n", + "\n", + "### Concepts\n", + "* **Tensors**: These represent the state of the training network at intermediate points during its execution\n", + "* **Debug Hook**: The hook is the construct with which Amazon SageMaker Debugger looks into the training process and captures the tensors requested at the desired step intervals\n", + "* **Rule**: A logical construct, implemented as Python code, which helps analyze the tensors captured by the hook and report anomalies, if any\n", + "\n", + "With these concepts in mind, let's understand the overall flow that Amazon SageMaker Debugger uses to orchestrate debugging.\n", + "\n", + "### Saving tensors during training\n", + "\n", + "The tensors captured by the debug hook are stored in the S3 location specified by you. There are two ways you can configure Amazon SageMaker Debugger to save tensors:\n", + "\n", + "#### With no changes to your training script\n", + "If you use one of the Amazon SageMaker provided [Deep Learning Containers](https://docs.aws.amazon.com/sagemaker/latest/dg/pre-built-containers-frameworks-deep-learning.html) for TensorFlow 1.15, then you don't need to make any changes to your training script for the tensors to be stored. Amazon SageMaker Debugger will use the configuration you provide through the Amazon SageMaker SDK's TensorFlow `Estimator` when creating your job to save the tensors in the fashion you specify. You can review the script we are going to use at [src/mnist_zerocodechange.py](src/mnist_zerocodechange.py). You will note that this is an untouched TensorFlow script which uses the `tf.estimator` interface. Please note that Amazon SageMaker Debugger only supports the `tf.keras`, `tf.Estimator` and `tf.MonitoredSession` interfaces. A full description of support is available at [Amazon SageMaker Debugger with TensorFlow](https://github.com/awslabs/sagemaker-debugger/tree/master/docs/tensorflow.md).\n", + "\n", + "#### Orchestrating your script to store tensors\n", + "For other containers, you need to make a couple of lines of changes to your training script. Amazon SageMaker Debugger exposes a library, `smdebug`, which allows you to capture these tensors and save them for analysis. It's highly customizable and allows you to save the specific tensors you want at different frequencies and possibly with other configurations. Refer to the [Developer Guide](https://github.com/awslabs/sagemaker-debugger/tree/master/docs) for details on how to use the Debugger library with your choice of framework in your training script. Here we have an example script orchestrated at [src/mnist_byoc](src/mnist_byoc.py). 
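The key hook calls in that script follow this pattern (a short sketch of the orchestration used there; `optimizer` stands for the optimizer object your model function builds):\n", + "\n", + "```python\n", + "import smdebug.tensorflow as smd\n", + "\n", + "# create the hook from the JSON configuration SageMaker puts in the container\n", + "hook = smd.SessionHook.create_from_json_file()\n", + "# wrap your optimizer so SageMaker Debugger can identify gradients\n", + "optimizer = hook.wrap_optimizer(optimizer)\n", + "# pass the hook to the training and evaluation calls\n", + "estimator.train(input_fn=train_input_fn, hooks=[hook])\n", + "```\n", + "\n", + "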
You also need to ensure that your container has the `smdebug` library installed.\n", + "\n", + "### Analysis of tensors\n", + "\n", + "Once the tensors are saved, Amazon SageMaker Debugger can be configured to run debugging ***Rules*** on them. At a very broad level, a rule is Python code used to detect certain conditions during training. Some of the conditions that a data scientist training an algorithm may care about are monitoring for gradients getting too large or too small, detecting overfitting, and so on. Amazon SageMaker Debugger comes pre-packaged with certain first-party (1P) rules. Users can write their own rules using the Amazon SageMaker Debugger APIs. You can also analyze raw tensor data outside of the Rules construct in, say, a SageMaker notebook, using Amazon SageMaker Debugger's full set of APIs. This notebook will show you how to use a built-in SageMaker Rule with your training job as well as provide a sneak peek into these APIs for interactive exploration. Please refer to the [Analysis Developer Guide](https://github.com/awslabs/sagemaker-debugger/blob/master/docs/api.md) for more on these APIs.\n", + "\n", + "## Setup\n", + "\n", + "Follow this one-time setup to get your notebook up and running to use Amazon SageMaker Debugger. This is only needed because we plan to perform interactive analysis using this library in the notebook. " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "! pip install smdebug" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "With the setup out of the way, let's start training our TensorFlow model in SageMaker with the debugger enabled.\n", + "\n", + "## Training TensorFlow models in SageMaker with Amazon SageMaker Debugger\n", + "\n", + "### SageMaker TensorFlow as a framework\n", + "\n", + "We'll train a TensorFlow model in this notebook with Amazon SageMaker Debugger enabled and monitor the training jobs with Amazon SageMaker Debugger Rules. This will be done using the Amazon SageMaker [TensorFlow 1.15.0](https://docs.aws.amazon.com/sagemaker/latest/dg/pre-built-containers-frameworks-deep-learning.html) Container as a framework.\n" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "import boto3\n", + "import os\n", + "import sagemaker\n", + "from sagemaker.tensorflow import TensorFlow" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Let's import the libraries needed for our demo of Amazon SageMaker Debugger." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "from sagemaker.debugger import Rule, DebuggerHookConfig, TensorBoardOutputConfig, CollectionConfig, rule_configs" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Now we'll define the configuration for our training job. We'll use an image recognition task on the MNIST dataset as our training example." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "# define the entrypoint script\n", + "entrypoint_script='src/mnist_zerocodechange.py'\n", + "\n", + "hyperparameters = {\n", + " \"num_epochs\": 3\n", + "}" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Setting up the Estimator\n", + "\n", + "Now it's time to set up our TensorFlow estimator. 
We've added new parameters to the estimator to enable your training job for debugging through Amazon SageMaker Debugger. These new parameters are explained below.\n", + "\n", + "* **debugger_hook_config**: This new parameter accepts a local path where you wish your tensors to be written, as well as the S3 URI where you wish your tensors to be uploaded. SageMaker takes care of uploading these tensors transparently during execution.\n", + "* **rules**: This new parameter accepts a list of rules you wish to evaluate against the tensors output by this training job. For rules, Amazon SageMaker Debugger supports two types:\n", + " * **SageMaker Rules**: These are rules specially curated by the data science and engineering teams in Amazon SageMaker which you can opt to evaluate against your training job.\n", + " * **Custom Rules**: You can optionally choose to write your own rule as a Python source file and have it evaluated against your training job. For Amazon SageMaker Debugger to evaluate this rule, you have to provide the S3 location of the rule source and the evaluator image.\n", + " \n", + "#### Using Amazon SageMaker Rules\n", + " \n", + "In this example we'll demonstrate how to have SageMaker rules evaluated against your training job. You can find the list of SageMaker rules and the configurations best suited for using them [here](https://github.com/awslabs/sagemaker-debugger-rulesconfig).\n", + "\n", + "The rules we'll use are **VanishingGradient** and **LossNotDecreasing**. As the names suggest, the rules will attempt to evaluate if there are vanishing gradients in the tensors captured by the debugging hook during training and also if the loss is not decreasing." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "rules = [\n", + " Rule.sagemaker(rule_configs.vanishing_gradient()), \n", + " Rule.sagemaker(rule_configs.loss_not_decreasing())\n", + "]\n", + "\n", + "estimator = TensorFlow(\n", + " role=sagemaker.get_execution_role(),\n", + " base_job_name='smdebugger-demo-mnist-tensorflow',\n", + " train_instance_count=1,\n", + " train_instance_type='ml.m4.xlarge',\n", + " train_volume_size=400,\n", + " entry_point=entrypoint_script,\n", + " framework_version='1.15',\n", + " py_version='py3',\n", + " train_max_run=3600,\n", + " script_mode=True,\n", + " hyperparameters=hyperparameters,\n", + " ## New parameter\n", + " rules = rules\n", + ")" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "*Note that Amazon SageMaker Debugger is only supported for py_version='py3' currently.*\n", + "\n", + "Let's start the training by calling `fit()` on the TensorFlow estimator." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "estimator.fit(wait=True)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Result \n", + "\n", + "As a result of calling `fit()`, Amazon SageMaker Debugger kicked off two rule evaluation jobs to monitor vanishing gradients and loss decrease, in parallel with the training job. The rule evaluation status(es) will be visible in the training logs at regular intervals. As you can see in the summary below, no step in the training reported vanishing gradients in the tensors, and the loss was likewise not flagged, so both rules finished with `NoIssuesFound`."
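, + "\n", + "If a rule had reported `IssuesFound`, you could also pick that up programmatically instead of reading the summary by eye; a minimal sketch, assuming the summary format shown below:\n", + "\n", + "```python\n", + "summary = estimator.latest_training_job.rule_job_summary()\n", + "issues = [r['RuleConfigurationName'] for r in summary\n", + "          if r['RuleEvaluationStatus'] == 'IssuesFound']\n", + "```"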
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "[{'RuleConfigurationName': 'VanishingGradient',\n", + " 'RuleEvaluationJobArn': 'arn:aws:sagemaker:us-west-2:072677473360:processing-job/smdebugger-demo-mnist-tens-vanishinggradient-1db16b4d',\n", + " 'RuleEvaluationStatus': 'NoIssuesFound',\n", + " 'LastModifiedTime': datetime.datetime(2019, 12, 1, 23, 47, 32, 186000, tzinfo=tzlocal())},\n", + " {'RuleConfigurationName': 'LossNotDecreasing',\n", + " 'RuleEvaluationJobArn': 'arn:aws:sagemaker:us-west-2:072677473360:processing-job/smdebugger-demo-mnist-tens-lossnotdecreasing-d6176866',\n", + " 'RuleEvaluationStatus': 'NoIssuesFound',\n", + " 'LastModifiedTime': datetime.datetime(2019, 12, 1, 23, 47, 32, 186000, tzinfo=tzlocal())}]" + ] + }, + "execution_count": 6, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "estimator.latest_training_job.rule_job_summary()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Let's take a look at the logs of the rule job for loss not decreasing. To do that, we'll use this utility function to get a link to the rule job logs." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "{'VanishingGradient': 'https://us-west-2.console.aws.amazon.com/cloudwatch/home?region=us-west-2#logStream:group=/aws/sagemaker/ProcessingJobs;prefix=smdebugger-demo-mnist-tens-VanishingGradient-1db16b4d;streamFilter=typeLogStreamPrefix',\n", + " 'LossNotDecreasing': 'https://us-west-2.console.aws.amazon.com/cloudwatch/home?region=us-west-2#logStream:group=/aws/sagemaker/ProcessingJobs;prefix=smdebugger-demo-mnist-tens-LossNotDecreasing-d6176866;streamFilter=typeLogStreamPrefix'}" + ] + }, + "execution_count": 8, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "def _get_rule_job_name(training_job_name, rule_configuration_name, rule_job_arn):\n", + " \"\"\"Helper function to get the rule job name with correct casing\"\"\"\n", + " return \"{}-{}-{}\".format(\n", + " training_job_name[:26], rule_configuration_name[:26], rule_job_arn[-8:]\n", + " )\n", + " \n", + "def _get_cw_url_for_rule_job(rule_job_name, region):\n", + " return \"https://{}.console.aws.amazon.com/cloudwatch/home?region={}#logStream:group=/aws/sagemaker/ProcessingJobs;prefix={};streamFilter=typeLogStreamPrefix\".format(region, region, rule_job_name)\n", + "\n", + "\n", + "def get_rule_jobs_cw_urls(estimator):\n", + " region = boto3.Session().region_name\n", + " training_job = estimator.latest_training_job\n", + " training_job_name = training_job.describe()[\"TrainingJobName\"]\n", + " rule_eval_statuses = training_job.describe()[\"DebugRuleEvaluationStatuses\"]\n", + " \n", + " result={}\n", + " for status in rule_eval_statuses:\n", + " if status.get(\"RuleEvaluationJobArn\", None) is not None:\n", + " rule_job_name = _get_rule_job_name(training_job_name, status[\"RuleConfigurationName\"], status[\"RuleEvaluationJobArn\"])\n", + " result[status[\"RuleConfigurationName\"]] = _get_cw_url_for_rule_job(rule_job_name, region)\n", + " return result\n", + "\n", + "get_rule_jobs_cw_urls(estimator)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Data Analysis - Interactive Exploration\n", + "Now that we have trained a job and looked at automated analysis through rules, let us also look at another aspect of Amazon SageMaker Debugger. 
It allows us to perform interactive exploration of the saved tensors, either in real time or after the job. Here we focus on after-the-fact analysis of the above job. We import the `smdebug` library, which defines a concept of a Trial that represents a single training run. Note how we fetch the path to the debugger artifacts for the above job." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "[2019-12-01 23:47:58.201 ip-172-16-62-176:30695 INFO s3_trial.py:42] Loading trial debug-output at path s3://sagemaker-us-west-2-072677473360/smdebugger-demo-mnist-tensorflow-2019-12-01-23-41-02-486/debug-output\n" + ] + } + ], + "source": [ + "from smdebug.trials import create_trial\n", + "trial = create_trial(estimator.latest_job_debugger_artifacts_path())" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We can list all the tensors that were recorded to know what we want to plot. Each one of these names is the name of a tensor, which is auto-assigned by TensorFlow. In some frameworks where such names are not available, we try to create a name based on the layer's name and whether it is weight, bias, gradient, input or output." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "[2019-12-01 23:49:10.433 ip-172-16-62-176:30695 INFO trial.py:197] Training has ended, will refresh one final time in 1 sec.\n", + "[2019-12-01 23:49:11.471 ip-172-16-62-176:30695 INFO trial.py:209] Loaded all steps\n" + ] + }, + { + "data": { + "text/plain": [ + "['gradients/conv2d/BiasAdd_grad/tuple/control_dependency_1:0',\n", + " 'gradients/conv2d/Conv2D_grad/tuple/control_dependency_1:0',\n", + " 'gradients/conv2d_1/BiasAdd_grad/tuple/control_dependency_1:0',\n", + " 'gradients/conv2d_1/Conv2D_grad/tuple/control_dependency_1:0',\n", + " 'gradients/dense/BiasAdd_grad/tuple/control_dependency_1:0',\n", + " 'gradients/dense/MatMul_grad/tuple/control_dependency_1:0',\n", + " 'gradients/dense_1/BiasAdd_grad/tuple/control_dependency_1:0',\n", + " 'gradients/dense_1/MatMul_grad/tuple/control_dependency_1:0',\n", + " 'sparse_softmax_cross_entropy_loss/value:0']" + ] + }, + "execution_count": 11, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "trial.tensor_names()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We can also retrieve tensors grouped into some default collections that `smdebug` creates from your training job. Here we are interested in the losses collection, so we can retrieve the names of tensors in the losses collection as follows. Amazon SageMaker Debugger creates default collections such as weights, gradients, biases, losses automatically. You can also create custom collections from your tensors."
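, + "\n", + "Once you have a tensor name, fetching its recorded values is straightforward; a minimal sketch using the same Trial API calls as the plotting helper further below:\n", + "\n", + "```python\n", + "loss_name = 'sparse_softmax_cross_entropy_loss/value:0'\n", + "steps = trial.tensor(loss_name).steps()\n", + "values = [trial.tensor(loss_name).value(s) for s in steps]\n", + "```"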
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "['sparse_softmax_cross_entropy_loss/value:0']" + ] + }, + "execution_count": 12, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "trial.tensor_names(collection=\"losses\")" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [ + { + "data": { + "image/png": "iVBORw0KGgoAAAANSUhEUgAAA1wAAAIWCAYAAABDUYx6AAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAAMTQAADE0B0s6tTgAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDMuMC4zLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvnQurowAAIABJREFUeJzs3XlcVXXixvHnsu8giygCIu6KgKLm3qa5JmVqTmU5tjktZtY0M82MM/2mqVkcyrJpmWZ0Krc0y73FzMl9BzVzAQVFUQQFZF/u+f1hMjluoFwOl/t5v173pXAP5z73APfr4/me77UYhmEIAAAAAFDnnMwOAAAAAACNFYULAAAAAGyEwgUAAAAANkLhAgAAAAAboXABAAAAgI1QuAAAAADARihcAAAAAGAjFC4AAAAAsBEKFwAAAADYCIULAAAAAGzExewAN8rd3V0hISFmxwAAALVw+vRplZWVmR0DAGzO7gtXSEiIMjMzzY4BAABqITw83OwIAFAvmFIIAAAAADZC4QIAAAAAG7H7KYUAAABXYxhG9Q0A6pLFYpGT09XPYVG4AABAo2S1WpWdna28vDzKFgCbcXV1VWRkpNzc3C57P4ULAAA0ShkZGXJyclJUVJRcXV3NjgOgETIMQ7m5uTp69KjatGlz2W0oXAAAoNGxWq0qLS1V27Zt5eLCP3cA2E5QUJDOnDkjq9V62emFLJoBAAAanQtTCC0Wi8lJADR2F15nrjR1mcIFAAAAADZC4QIAAAAAG6FwAQAA4IYtWbJEHTt2VHx8vPbs2aPf//73Ki0tNTtWo9CQjqXFYlFeXl6d77eiokJRUVEqLy+/rq+fPXu27rrrrjrNZLVa9fTTT6t169Zq06aNZs6ceV37oXABAAA0AlVVVaY+/jvvvKNp06YpOTlZXbp00UsvvdRgSsK1WK1WWa1Ws2Nc0dWOZWVlZT2nsY21a9eqd+/eV1xa3QwfffSR9u3bp4MHD2rr1q3661//qu+++67W+6FwAQAAh/DIv7dpUNJ/bHJ75N/bapShpKRE9957rzp16qS4uDjdcccdWrt2rWJiYvTggw8qJiZGCQkJSk5OliSdPHlSt956qxISEtS5c2c99dRT1cVg9uzZuvXWW3XPPfeoS5cu2rp1q15++eXqs0zx8fHKyMiQJG3btk233Xabunfvrq5du2rhwoVXzbls2TLFxsYqPj5eMTExWrJkiSQpNTVVAwcOrL7vs88+kyRNnjxZ69at04svvqg+ffpo0qRJkqT+/fsrPj5e2dnZmjBhgh577DENHDhQrVq10sSJE7V161bdcsstio6O1tSpU6sfPykpST169FB8fLx69OihTZs2SZJOnz6tqKgobd68WZK0aNEixcXFqaSk5IrPJT8/X4888ohiYmIUFxeniRMnSjp/1uiee+7R4MGDFRMTo6ysLG3fvl19+vRRbGysevbsqQ0bNlQ/7h133KEuXbooNjZWP/3pTyVJmzdvVkJCQvVxevvtt696XK/0fUhPT1dAQIB+97vfKSEhQW3atNHKlSsl6YrHcuLEiRowYIBiYmIkSV988YW6deum2NhY3Xzzzdq3b58kXfXna8SIEZo7d251vi+//FI33XTTVZ/Dj9Xl8frss8909913a86cORoxYkT15w3DUHR0tFJSUq76+/Bja9euVXx8fPXHe/fuVVRUVPXHX3zxhfr166eEhAT17NlT33zzzWWf34IFC/Too4/K2dlZgYGBuvfeezVv3rwaH58fPwm71qJFC7MjAACAWrL1+F1ZWWns27fPqKysrP7cw7O3GgP/ttYmt4dnb61RrsWLFxt33HFH9ce5ubnGN998Y0gyVq9ebRiGYSxYsMBo3769YbVajZKSEuPcuXPVz2n48OHGvHnzDMMwjFmzZhmenp7G/v37DcMwjDNnzhj+/v5GcXGxYRiGUVRUZJSUlBhnz5414uPjjRMnThiGYRinT582IiIijMzMzCvmjI2NNTZu3GgYhmFUVVUZZ8+eNQzDMHr27Gm88847hmEYxsGDB43AwEAjPT3dMAzDuPnmm41PP/20eh+Sqr/OMAzjoYceMnr16mWUlJQYZWVlRuvWrY277rrLKC8vNwoLC42mTZsae/fuNQzDMLKzs6u/btOmTUb79u2rP/7222+N6OhoY8uWLUaLFi2MAwcOXPWYT5gwwfjZz35mVFVVXbTv3/3ud0bz5s2NkydPGoZhGGVlZUZERITx+eefG4ZhGOvWrTNCQ0ONc+fOGUlJScZjjz1Wvc/c3FzDMAxj5MiRxty5c6s/f+bMmSvmuNr34ciRI4YkY9GiRYZhGMaqVauMdu3aXfVYxsbGGgUFBYZhGMapU6eMwMBAY/fu3YZhGMZHH31kdOzY0bBarVf9+fryyy+N3r17V+935MiRxgcffHDV43khS10eL6vVakRHRxsFBQVGcXGxERQUZGRlZRmGYRhr1qwxunXrZhiGcc3fh8TERMMwDOObb74x4uLiqh9nz549RsuWLQ3DMIy0tDSjV69eRn5+vmEYhnHo0CGjWbNmRmlpqWEYhhEXF2ccP37cMAzDiImJqf49MAzDeOutt4zx48dfckwu93rzY7wxBQAAcAjvP9TD7AiKi4vT999/ryeeeEI333yzhg0bJkmKiorS7bffLkkaO3asHnvsMR07dkzBwcH6xS9+ofXr18swDGVnZysmJkbjxo2TJPXp00ft27eXJPn5+alt27Z64IEHdMcdd2j48OEKDw/XmjVrdPjwYQ0dOvSiLAcOHFCLFi0um/P222/XM888o9GjR+uOO+5QfHy8zp07p507d1afxWjbtq369eundevWqWXLljV6/omJifLw8JAkdenSRYMHD5arq6tcXV3VqVMnHTp0SJ07d9auXbv0xz/+Ubm5uXJxcdGBAwdUUlIiT09P9e/fXw8//LD69OmjDz74QO3atbvqYy5fvlxbtmypfn+kkJCQ6vuGDRum0NDQ6uPh5OSkwYMHS5L69eun0NBQJScnq1evXnrttdf03HPPacCAARoyZIgk6dZbb9Uf/vAHHTp0SLfddpv69et3
xRwbN2684vchOjpaHh4eGjVqlCSpd+/eSktLu+rzGjNmjHx9fSVJW7ZsUZcuXdSlSxdJ0v33368nn3xSx48fl3Tln69BgwZpypQp2rVrlwIDA7V161Z9/PHHV33cH+euq+O1bds2dejQofr53HPPPfrwww/185//XLNnz64+Q2a1Wq/6+1ATn3/+uVJTUzVgwIDqzzk5Oeno0aNq27Zt9dm/usSUQgAAgHoSHR2tffv2aciQIdqwYYNiYmJ09uzZS7azWCyyWCxKSkpSdna2tmzZot27d+u+++676FoeHx+f6r87Oztr8+bNmjJlirKzs9WrVy+tW7dOhmGoc+fOSk5Orr4dPXpUt9122xVzJiUladasWfLy8tJDDz2kv/zlL5fdrrbvc3ahbF3I+78fV1ZWqry8XKNGjdL06dO1d+9effvtt5KksrKy6m137dqlkJAQHTt2rFaP/79+fPwu58Lz6927t5KTk3XTTTdp8eLF6tGjh6qqqjRlyhStWLFCzZs314svvqgnnnjiivu61vfB3d29+vGcnZ2veU3etbJf63ldeKzJkyfrzTff1DvvvKOJEyfK3d39hvYr1f54ffrppxcteDFx4kTNmjVLhYWFWr58ue677z5JuubvwwUuLi4XHb8fb2MYhgYNGnTR9+H48eNq27btJfuJjIysnpYrnZ/6GRkZWevjQuECAACoJ5mZmbJYLBo5cqSmT58uwzB07NgxpaenV19HsmjRIoWGhio8PFxnz55Vs2bN5OHhoZMnT1712qtz587p1KlT6t+/v37729+qX79+2rVrl/r06aMjR45o9erV1dsmJydfdTW4/fv3V18j87Of/UybN2+Wr6+vunXrplmzZkk6fz3X+vXrLzpT8GO+vr7Kz8+v9TEqLS1VeXl59T9s33zzzYvunzlzps6ePauUlBS9++671WfcruTCsb5wrc/p06cvu1379u1ltVr11VdfSTp/RurkyZOKj4/XkSNH5OPjo7Fjx+rNN9/UwYMHVVhYqAMHDqhVq1Z69NFH9eKLL1ZfW3Y51/N9uOBax7JXr17as2eP9u7dK0maP3++WrRoUX0G80o/X5I0fvx4ffHFF5o1a1b19WI1UZfHa+nSpUpMTKze94XryJ5//nkNHDhQgYGBklTj34fo6GhlZGRUf68//PDD6vsGDx6s1atXa/fu3dWf27p162X3M2bMGP3jH/9QVVWVzpw5owULFujee++t8TG6gCmFAAAA9WTPnj361a9+JcMwVFlZqfHjxys2NladO3fW7NmzNXnyZLm5uWnevHmyWCzV0/o6d+6ssLAwDRw48Ir7zs/P1+jRo1VUVCSLxaK2bdvqoYcekr+/v1asWKHnn39ezz33nCoqKhQZGVm94MXlvPjiizpw4IDc3Nzk5eVVvbjBnDlzNGnSJM2cOVMWi0Xvv//+Ff/H/7nnntOgQYPk5eWlL7/8ssbHyM/PTy+//LJ69uyp4ODgi6aL7dy5U9OnT9eWLVvUtGlTffTRR3rggQe0bds2BQUFXXZ/r732mp599ll16dJFrq6u6tGjh/7xj39csp2bm5sWL16syZMn67nnnpOHh4cWLVokHx8fLVy4UElJSdVn4f7617/K399fv/nNb7RmzRq5ubnJ2dlZf/vb3674vJo0aVLr78MF1zqWISEhmjNnjh588EFVVlaqSZMmWrhwYfUZpyv9fEmSl5eXRo0apRMnTigiIuKaWer6eO3fv19NmjRR06ZNL9r/T3/6U73wwgtatWpV9edq+vsQFhamF154QT179lRoaOhF0zjbtGmjuXPn6vHHH1dxcbHKy8vVtWvX6sVD4uPjtXLlSoWFhWn8+PHatm2b2rZtK4vFoqlTp1ZP26wNyw8Xv9mt8PBwZWZmmh0DAADUgq3H76qqKh08eFDt2rWTs7OzzR6nLqxdu1ZTpkyxybUjwLV+vqqqqpSQkKA333xT/fv3r+d00p/+9Ce5uLjo+eefr/fHrivXer1hSuFlGIahxTszlV9cYXYUAAAAwCaWLl2q1q1bq3fv3qaULUn65S9/addlqyY4w3UZ29PPaPQ7m+TqbNEt7ZsqMT5Mt3cIladbw/4fMgAA7AVnuMyXnJysCRMmXPL5hx56SM8++2z9B7oBK1eu1IsvvnjJ53/1q19d1zU3N2LSpEmXvZZr06ZN8vT0rNcs16sxPIf6dK3XGwrXZZwrrdDne09qacoJbUjNkdWQvN2cNTimmRLjW6hv6yC5OHNyEACA60XhAtBYXOv1hkUzLsPXw1VjukdoTPcIZZ8r1YrdWVqSfEKLdx7X4p3HFeTtphGxzTUyvoW6RQbUeklUAABgWxfGZjv/f2UAduDC68yVOoFNz3CVlpZq3Lhx2rdvnzw9PdW0aVO9/fbbatOmzUXbHTlyRKNHj1ZVVZUqKyvVsWNHvffee2rSpMk1H6M+F81IzynS0pQT+iz5uA6fLpIkRQR6amRcmBLjW6hdqG+95AAAwN7Vx/h95MgROTk5KTQ0VK6urjZ9LACOyTAM5ebm6ty5c5d0nAtsXrjWrFmjoUOHymKxaObMmVq0aJHWrl170XZlZWWyWq3Vc0KfeeYZSdKMGTOu+RhmrFJoGIa+O1GgpSkntDT5hE4WnH8ztQ7NfJUY30Ij48PUIoD5rQAAXEl9jN9Wq1XZ2dnKy8vjTBcAm3F1dVVkZKTc3Nwue3+9XsO1fft2jR49Wunp6VfcpqqqSo8//rh8fHz0+uuvX3OfZi8Lb7Ua2pp+RkuST2jlnizll5xf2bBnVKBGxodpWJfmCvS+/MEHAMBR1ef4bRhG9Q0A6pLFYpGT09XXdqjXwjV+/HgFBgZe9sxVeXm5evbsqYyMDMXGxmrp0qXy9/e/ZLukpCQlJSVVf1xYWKi8vDyb5q6pssoqfXswR0uSj2v196dUWmGVi5NFA9qFKDE+TIM6hcrLjcvmAAAw+z9MAaC+1FvheuWVV7Rs2TJ9/fXX8vLyuuJ25eXlevrpp9W6dWu98MIL19xvQ33BLiyr1Ff7TmpJ8gmtO5SjKqshT1dn3dE5VInxYerfNkSurHQIAHBQDXX8BoC6Vi+Fa/r06Zo/f75Wr16tgICAa26/efNmPfroo9qzZ881t7WHF+zcwjKt2HN+pcMdGWclSU28XDWsS3MlxrdQ95ZN5OTESocAAMdhD+M3ANQFmxeupKQkzZkzR6tXr77iqoMZGRkKCQmRl5eXrFarfv7zn+vkyZOaM2fONfdvby/Yx84Ua2nKCS1JPq6DpwolSS0CPHVnXJgS48PUoZkvy8wDABo9exu/AeB62bRwZWZmKiIiQtHR0fL1Pb9kuru7u7Zs2aJp06YpLCxMkyZN0rJly/TrX/9a0vkVhbp166bXXntNQUFB13wMe37B3n+yQEuSz690eDyvRJLULtTn/EqHcWGKCLzy1EsAAOyZPY/fAFAb9bpohi00hhdsq9XQjqNntST5uFbsztLZ4vMrHYb6uat1iI+iQ7x/+NNHrUO8FebvyRREAIBdawzjNwDUBIWrgamosmr9oRyt3JOl/Sf
P6fDpQhWVV120jYerk1oF/7eItf7hz1bB3vJ2ZxVEAEDD19jGbwC4EgpXA2cYhk4VlOnw6UKlnS5U2ukipZ0u1OHTRdXTEH+sub/Hf8+IBXurddPzZ8aa+3lwVgwA0GA09vEbAC7gdEgDZ7FY1MzfQ838PdSnTfBF95WUV+lIzn8LWNrpQh3OKdSuo3nakJp70baers5qdaGA/ejP6BBv3hsMAAAAsBH+pW3HPN2c1SnMT53C/C76vGEYOllQqrTsIh3OKVRadqEO5xQpLbtQy1JOXLKfMH+P/yliPmrd1FvN/DxYMREAAAC4AUwpdDDF5ZU6fLqouoBd+PNITpFKKi6+VqxtUx+NTgjX3d1aqKmvh0mJAQCNEeM3AEdB4YKk8yslZhWU6vAP0xO/zyrQ59+dVF5xhZydLLqlXYjGdA/XbR1C5ebiZHZcAICdY/wG4CgoXLiissoqff19thZuP6b/HDwtqyE18XJVYnwLjekers5h/mZHBADYKcZvAI6CwoUaOVVQqk93HdfC7ceUdrpIktSpuZ9GJ4Trrq4tFOjtZnJCAIA9YfwG4CgoXKgVwzC061ieFm7P1PKUEzpXVilXZ4tu7xCq0QnhuqV9iFycmXIIALg6xm8AjoLChetWWlGlL747qYXbM7UhLUeGIQX7uGtUtxYakxCutqG+ZkcEADRQjN8AHAWFC3XieF6JFu/I1KKdmcrILZYkxYX7a3T3CI2MDZO/l6vJCQEADQnjNwBHQeFCnTIMQ1uPnNGiHZlasSdLxeVVcnNx0uDOzTQ6IVz92gTL2Yn39gIAR8f4DcBRULhgM0VllVq5J0uLdmRqy5EzkqTm/h4a1a2FRidEqFWwt8kJAQBmYfwG4CgoXKgXGblF+mRHpj7ZeVzH80okSd1bNtGY7uEaHhsmH3cXkxMCAOoT4zcAR0HhQr2yWg1tOpyrhduPadXekyqrtMrT1VlDY5ppdPdw9WoVJCemHAJAo8f4DcBRULhgmoLSCq3YnaWF249p59E8SVJEoKfu6Raue7qFKyLQy+SEAABbYfwG4CgoXGgQUrMLtWhHphbvzFT2uTJJUlSQl6KCvdUq2FvRwd5qFeyjViHeau7nwVkwALBzjN8AHAWFCw1KZZVV61Jz9Nmu4/o+q0DpucUqr7RetI27i5NaBXsrKshbrUJ+XMi8FejtJouFMgYADR3jNwBHQeFCg1ZlNXQir0RHcoqqb4dzipSeU6TMs8Wy/s9Pr5+Hi1qF+Cj6R4UsOthbUcHeLMwBAA0I4zcAR8G/QNGgOTtZFBHopYhALw1oF3LRfWWVVTp2pliHT19axlKO5V2yr6a+7ufPhoX8UMZ++HtEoJfcXZzr6ykBAADAgVC4YLfcXZzVpqmv2jT1veS+wrJKpf9QwI6cLtKRnEIdySnSvqyC6vcEu8DJIoU38VKrH6YlXrh1aeGvJt5u9fV0AAAA0AgxpRAOxTAMnSkqrz4bdqS6kBXpSG7RRdeLWSxSx2Z+6tsmSH3aBKtnVKC8mZYIAHWC8RuAo6BwAT+wWg1lFZTqyOkipZ0u1Lb0M9qUlqvconJJkouTRV0jA9S7dbD6tg5S18gmcnNxMjk1ANgnxm8AjoLCBVyF1WrowKlz2pCao41pudpyOFdF5VWSJE9XZ/VoFai+rYPUt02wOjX3Y7l6AKghxm8AjoLCBdRCRZVVuzPztTE1RxvScrQzI0/lVeenIQZ4uapXq6DqKYjRwd4sUQ8AV8D4DcBRULiAG1BSXqXtGWe0MS1XG1NztOd4fvVS9c38PNSnTZD6tA5W3zZBau7vaW5YAGhAGL8BOAoKF1CH8ksqtPnw+fK1MS1Xh7ILq++LDvZWnzZB6ts6WL2ig1gBEYBDY/wG4CgoXIANZReUamNabvU1YMfzSiSdXwGxU3M/9W0TrD6tg9SzVaC83FgBEYDjYPwG4CgoXEA9MQxDR88Ua0Nqrjak5WhTWq7O/LACoquzRfERAT9MPwxWfEQAKyACaNQYvwE4CgoXYBKr1dD+k+e0Me3SFRC93Jx1c7sQjU4I183tQuTiTPkC0LgwfgNwFBQuoIE4vwJinjak5mr9oRxtyzgjw5CCfdw1qlsLjU4IV7tQX7NjAkCdYPwG4CgoXEADlXm2WJ/uPK5FOzOVkVssSYoL99fohHCNjGshfy9XkxMCwPVj/AbgKChcQANnGIa2pZ/Vwu3HtGJPlorLq+Tm4qRBnUI1JiFc/duGyJk3XAZgZxi/ATgKChdgR4rKKrVq70kt2nFMmw+fkSSF+rlrVLdwjU4IV+sQH5MTAkDNMH4DcBQULsBOHTtTrEU7MrVoR2b1cvPdIgM0OiFCI+Kay8+DKYcAGi7GbwCOgsIF2Dmr1dDmI7latD1TK/dmqbTCKncXJw2JaaYxCRHq3TqIKYcAGhzGbwCOgsIFNCLnSiu0as9JLdxxTNvSz0qSwvw9qqccRgV7m5wQAM5j/AbgKChcQCOVnlOkRTsy9cnOTGXll0qSekQ10ZiECA2LbS4fdxeTEwJwZIzfABwFhQto5Kqshjam5WjRjkx9vvekyiqt8nR11tAuzTQ6IVy9WgXJiSmHAOoZ4zcAR0HhAhxIQWmFlqdkadGOY9p5NE+SFN7EU/f8MOUwItDL5IQAHAXjNwBHQeECHFRqdqE+2ZmpxTszdaqgTJLUKzpQoxMiNKxLM3m5MeUQgO0wfgNwFBQuwMFVWQ2tO3RaC3dk6qvvTqm8yipvN2eNjA/TT/u2UrtQX7MjAmiEGL8BOAoKF4BqecXlWrY7Swu3H9PuzHxJUv+2wZrYt5VubhfCtV4A6gzjNwBHQeECcFk7j57Vv9Yf0aq9J1VlNRQd7K0JfaN0T7dwebPCIYAbxPgNwFFQuABc1Ym8En24OUNztxxVfkmFfD1cNK5HhB7sHcUiGwCuG+M3AEdB4QJQIyXlVVq8K1OzNqQrNbtQThZpcOdmmtivlbq3bCKLhemGAGqO8RuAo7Bp4SotLdW4ceO0b98+eXp6qmnTpnr77bfVpk2bi7bbs2ePnnzySWVnZ8vFxUU9e/bUW2+9JU9Pz2s+Bi/YQP0yDEPrDuXoXxuOaO2B05KkmBZ+mti3lYbHNpe7i7PJCQHYA8ZvAI7C5oVrzZo1Gjp0qCwWi2bOnKlFixZp7dq1F2136NAhlZSUKDY2VlVVVbrvvvvUsWNH/f73v7/mY/CCDZgnNbtQ/96YrkU7MlVSUaUQX3c9cFNL3d8rUsE+7mbHA9CAMX4DcBT1OqVw+/btGj16tNLT06+63fTp07V3717Nnj37mvvkBRswX35xheZvO6oPNmXoeF6J3JydlPjDsvKdwvzMjgegAWL8BuAo6rVwjR8/XoGBgZoxY8YVtykqKlJCQoJeffVV3X333Zfcn5SUpKSkpOqPCwsLlZeXZ5O8AGqnssqqL/ed0r/WH9H2jLOSzr+Z8sS+rXR7x1A5s6w8gB9QuAA4inorXK+88oqWLVumr7/+Wl5el1/ZrLy8XKNGjVJ0dLTeeOONGu2XF2ygYdqdmadZG9K1LOWEKq2GIgO99F
CfKI3tHi5fD1ez4wEwGeM3AEdRL4Vr+vTpmj9/vlavXq2AgIDLblNRUaGxY8cqODhY7733Xo1XPOMFG2jYThWU6qPNGZqz5ajOFJXLx91FY7qHa0KfKLUM8jY7HgCTMH4DcBQ2L1xJSUmaM2eOVq9erSZNmlx2m8rKSt17770KCAjQ+++/X6vlpXnBBuxDaUWVliaf0L82HNH+k+dksUi3dwjVxH5R6h0dxLLygINh/AbgKGxauDIzMxUREaHo6Gj5+vpKktzd3bVlyxZNmzZNYWFhmjRpkubMmaMHHnhAsbGx1f/o6tu3r956661rPgYv2IB9MQxDm9Jy9a8NR/T1/mwZhtShma8m9m2lkfFh8nBlWXnAETB+A3AUvPExANOk5xRp9sZ0Ldx+TEXlVQrydtP9N0XqgV4t1dTPw+x4AGyI8RuAo6BwATBdQWmFFm7P1OyNR3TsTIlcnS26r2ekfjWsI2e8gEaK8RuAo6BwAWgwqqyGVn9/Su/8J027juYppoWf3r4/QRGBl1/ZFID9YvwG4CiczA4AABc4O1k0uHMzLZrUR5Nvb6u9xws0/I11WrP/lNnRAAAArguFC0CD4+xk0dRB7TTrpz3k5GTRxNnb9dcv9qvKatcn5AEAgAOicAFosG5t31TLnuqn2HB/vfVNmsb/c4tyCsvMjgUAAFBjFC4ADVpEoJcWTuqt+2+K1Ma0XA1/Y522p58xOxYAAECNULgANHjuLs76491d9Nq9ccovqdC49zbrn+uPyM7X/AEAAA6AwgXAbtzdNVxLnuynyEAv/WH5Pj01d5cKyyrNjgUAAHBFFC4AdqV9M18teaqvhsY004o9WRo5c70OnjpndiwAAIDLonABsDu+Hq76+/3d9JvhHZWRW6zEmRv02a7jZscCAAC4BIULgF1AMqy6AAAgAElEQVSyWCx6pH+05j/WS74eLpqyIFm//WyvyiqrzI4GAABQjcIFwK71iArUisn91Ts6SB9uztDYdzfreF6J2bEAAAAkUbgANAIhvu768OGeeuKW1ko5lqcRb6zTfw6eNjsWAAAAhQtA4+Di7KQXhnTQ+w92V6XV0IRZW/XaVwdVZWXpeAAAYB4KF4BGZWCnUK14ur86NffTjK8PacKsrTpTVG52LAAA4KAoXAAancggL33ysz66t3uE1h3K0Yg31in5WJ7ZsQAAgAOicAFolDxcnfXn0bH6y+hY5RaVa8w7G/XhpnQZBlMMAQBA/aFwAWjUxnaP0OIn+qi5v6d+u+Q7TVmQrOLySrNjAQAAB0HhAtDodQ7z17Kn+2lgx1AtST6hxJkblJpdaHYsAADgAChcAByCv6er/vFggn45tIPSThcqceZ6rdidZXYsAADQyFG4ADgMi8WiSTe31pxHesnTzUVPzt2p/1u2TxVVVrOjAQCARorCBcDh9G4dpJWT+6lnVKD+teGIxr23WVn5JWbHAgAAjRCFC4BDaurnoTmP3qTHBkRrR8ZZjXhjvTak5pgdCwAANDIULgAOy9XZSS8O66h3Huimskqrxv9zi2auOSSrlaXjAQBA3aBwAXB4Q2Kaa+lTfdUu1FfTvzyoRz7YrvziCrNjAQCARoDCBQCSokN89OkTfTWqWwut2Z+toTO+1UamGAIAgBtE4QKAH3i6OetvY+L0p1FdlFdSofve36KXln2n0ooqs6MBAAA7ReECgB+xWCwa1zNSq57pr+4tm2jWhnQNf2OdUo7lmR0NAADYIQoXAFxGyyBvLXi8t345tIOOnSnRqLc3Kumrg7xnFwAAqBUKFwBcgbPT+TdKXvr0+QU13vj6kO7++wYdOnXO7GgAAMBOULgA4Bo6NPPTkif76slbW2vfiQINf3O93l93mOXjAQDANVG4AKAG3Fyc9PPBHbRwUh+1CPDUyyu+10/+sVnHzhSbHQ0AADRgFC4AqIWElk20YnI/PdS7pbYcOaMhr3+rBduOyjA42wUAAC5F4QKAWvJyc9FLiTH68OGe8vVw1S8+2aNH/r1d2edKzY4GAAAaGAoXAFyn/m1D9MWzAzSqawt9vT9bg1/7Viv3ZJkdCwAANCAULgC4Af6erkq6N15v399NkvTEnJ2aMn+X8osrTE4GAAAaAgoXANSBoV2a68tnb9bAjk31WfIJDX79W3178LTZsQAAgMkoXABQR0J83fWPB7vrL6NjVVhWqQf/tVW/+WyPissrzY4GAABMQuECgDpksVg0tnuEVj3TXze1CtRHm49q2Ix12pFxxuxoAADABBQuALCBiEAvzXu0l347opNO5JdqzDub9JfP96usssrsaAAAoB5RuADARpycLHq4XyuteLqfOof56+9r05Q4c4O+zyowOxoAAKgnFC4AsLG2ob5a/EQfTRnYVoeyCzVy5nq9vTZNVVbeLBkAgMaOwgUA9cDV2UlTBrbTp0/0UWSgl/78+X6NfXeT0nOKzI4GAABsiMIFAPUoNjxAKyb318P9WmlHxlkNnbFOH23OkGFwtgsAgMaIwgUA9czD1Vm/HdFJ8x7tpUBvN/3ms716aNY2ncwvNTsaAACoYxQuADBJ79ZB+nxKf43tHq5vD57WHa/9R0uSj3O2CwCARsSmhau0tFR33XWX2rVrp7i4OA0aNEipqamXbFdYWKjBgwcrODhYAQEBtowEAA2Kr4er/jI6Tu8/2F1uLk56Zn6ynpq3S2eLys2OBgAA6oDNz3A99thjOnDggFJSUpSYmKhHHnnkkm1cXV31i1/8QqtXr7Z1HABokAZ2CtUXUwZoSOdmWrE7S3f9fYPySyrMjgUAAG6QTQuXh4eHhg0bJovFIknq1auX0tPTL9nO3d1dt912G2e3ADi0IB93vf1AN/16WEdl5Bbrl5/sZnohAAB2rl6v4ZoxY4YSExNvaB9JSUkKDw+vvhUWFtZROgAwn8Vi0SP9W2l4bHOt2ntSH23OMDsSAAC4AfVWuF555RWlpqbq1VdfvaH9TJ06VZmZmdU3Hx+fOkoIAA2DxWLRq6O6KDLQS39Y/r32Hs83OxIAALhO9VK4pk+frsWLF2vVqlXy8vKqj4cEALvm5+Gqmfd1lSFDT83dqcKySrMjAQCA62DzwpWUlKR58+bpq6++4hotAKiF2PAAvTiso9Jzi/Xi4j1czwUAgB2yaeHKzMzUc889p7y8PN16662Kj4/XTTfdJEmaNm2a3nnnneptY2Nj1bt3bxUUFCg8PFzjx4+3ZTQAsAsT+kRpUKdQLU05oQXbjpkdBwAA1JLFsPP/Mg0PD1dmZqbZMQDAZvKKyzX8jfXKKSzT0qf6qX0zX7MjATeM8RuAo6jXVQoBALUX4OWmN37SVVVWQ0/O3anicq7nAgDAXlC4AMAOJLRsop8Pbq/U7EJNW/Kd2XEAAEANUbgAwE482j9at7QP0aIdmfpkB1OxAACwBxQuALATTk4W/W1MnEL93PXbJXuVms0bvwMA0NBRuADAjgT5uOuNcV1VWlGlp+buVGlFldmRAADAVVC4AMDO3BQdpGcHttP+k+f0f8v3mR0HAABcBYULAOzQE7e2Ud82QZq75aiWp
ZwwOw4AALgCChcA2CFnJ4teuzdewT7u+tXiPcrILTI7EgAAuAwKFwDYqaa+Hnr93ngVlVfqybk7VVbJ9VwAADQ0FC4AsGP92gbrqVvbaO/xAr26cr/ZcQAAwP+gcAGAnXvm9rbqGRWo2RvT9cV3J82OAwAAfoTCBQB2zsXZSW/8pKuaeLnq5wtTdOxMsdmRAADADyhcANAINPP3UNLYeBWUVurpebtUUWU1OxIAABCFCwAajVs7NNXjA6KVfCxP0784YHYcAAAgChcANCrPD26vrpEBevfbw1qz/5TZcQAAcHgULgBoRFydnfTmT7rKz8NFz32coqz8ErMjAQDg0ChcANDIhDfx0l/HxOlscYWemZesSq7nAgDANBQuAGiEBndupp/2jdLW9DN6ffUhs+MAAOCwKFwA0Ej9cmgHdWnhr7fWpmrdodNmxwEAwCFRuACgkXJ3cdbM+7rKx81Fzy5IVva5UrMjAQDgcChcANCItQzy1qv3dFFOYbmmzE9WldUwOxIAAA6FwgUAjdyI2DDdf1OkNqbl6q1vUs2OAwCAQ6FwAYAD+O2ITurQzFevrz6ozYdzzY4DAIDDoHABgAPwcHXWW/d3k4ers56Zv0u5hWVmRwIAwCFQuADAQbQO8dEf747RqYIyTf04RVau5wIAwOYoXADgQO7uGq4xCeH6z8HTem/dYbPjAADQ6FG4AMDBvJTYWW2b+uivXxzQjowzZscBAKBRo3ABgIPxcnPRW/d3k6uzRZPnJSuvuNzsSAAANFoULgBwQO1CffXSyM46nlei5xfulmFwPRcAALZA4QIABzW2e4QS48O0+vtTmrUh3ew4AAA0ShQuAHBQFotFf7y7i1oFe+vVVd8r5Vie2ZEAAGh0KFwA4MB83F00876uslgsemreThWUVpgdCQCARoXCBQAOrnOYv347vKOOnSnRLz/hei4AAOoShQsAoAd6tdSwLs20cs9JfbTlqNlxAABoNChcAABZLBa9OipWEYGe+sPyffruRL7ZkQAAaBQoXAAASZK/p6tm/qSbDMPQzz7aqRN5JWZHAgDA7lG4AADV4iIC9IfEGB09U6wx72xSRm6R2ZEAALBrFC4AwEXG9YzUX0fHKiu/RGPf3aTU7HNmRwIAwG5RuAAAlxjTPUJv/KSrcgvLNfbdzVzTBQDAdaJwAQAua0RsmN4dn6DCskr95L3N2nn0rNmRAACwOxQuAMAV3d4xVLMm9FBFlaEH3t+iTWm5ZkcCAMCuULgAAFfVt02wPny4p5wtFk2YtVXfHMg2OxIAAHaDwgUAuKbuUYGa+2gvebo567EPtuvzvVlmRwIAwC5QuAAANdIl3F8LHustf083PTl3lz7dlWl2JAAAGjwKFwCgxto389XCSb0V6uuuqR+naM6WDLMjAQDQoFG4AAC10irYWx9P6q3IQC/9+tO9en/dYbMjAQDQYNm0cJWWluquu+5Su3btFBcXp0GDBik1NfWy2y5fvlwdOnRQ27ZtNWrUKBUUFNgyGgDgBoQ38dLCx3urbVMfvbzie73x9SEZhmF2LAAAGhybn+F67LHHdODAAaWkpCgxMVGPPPLIJdsUFhbq4Ycf1meffaZDhw4pLCxMf/jDH2wdDQBwA5r6eWjB473VOcxPSV8d1J8+30/pAgDgf9i0cHl4eGjYsGGyWCySpF69eik9Pf2S7VatWqWuXbuqQ4cOkqQnnnhC8+bNs2U0AEAdCPR209xHe6lbZIDe/c9h/W7pd7JaKV0AAFxQr9dwzZgxQ4mJiZd8/ujRo2rZsmX1x1FRUcrKylJlZeUl2yYlJSk8PLz6VlhYaNPMAICr8/d01YcP36Te0UH6YFOGXvhktyqrrGbHAgCgQai3wvXKK68oNTVVr7766g3tZ+rUqcrMzKy++fj41FFCAMD18nZ30ayf9tBtHZpq0Y5MPTM/WeWVlC4AAOqlcE2fPl2LFy/WqlWr5OXldcn9kZGRysj479LC6enpat68uVxcXOojHgCgDni4OuudBxI0rEszrdiTpZ99tEOlFVVmxwIAwFQ2L1xJSUmaN2+evvrqKwUEBFx2myFDhmjnzp3av3+/JOnvf/+7xo0bZ+toAIA65ubipDfGddWobi309f5sPfzvbSoqu3R6OAAAjsJi2HBJqczMTEVERCg6Olq+vr6SJHd3d23ZskXTpk1TWFiYJk2aJElaunSpXnjhBVVWViomJkb//ve/5e/vf83HCA8PV2Zmpq2eAgDgOlithqYt3auPNh9VQssmmvXTHvLzcDU7FhoQxm8AjsKmhas+8IINAA2TYRh6ddV+vfftYcW08NMHE29SoLeb2bHQQDB+A3AU9bpKIQDAcVgsFv1qaAdNGdhWe48XaNx7m5R9rtTsWAAA1CsKFwDAZiwWi6YMbKdfD+uog6cKNfadTTqeV2J2LAAA6g2FCwBgc48OiNbLd8UoPbdYY9/ZpPScIrMjAQBQLyhcAIB68UCvlvrbmDhl5ZdozLubdPDUObMjAQBgcxQuAEC9uSchXDPv66a84nLd++4m7T2eb3YkAABsisIFAKhXw7o013vju6uovEo/eW+zdmScMTsSAAA2Q+ECANS7Wzs01ewJPVRlGBr/z63amJpjdiQAAGyCwgUAMEWfNsH68OGb5Oxk0YTZ27Rm/ymzIwEAUOcoXAAA0yS0bKJ5j/aSj7uLHvtgh1bszjI7EgAAdYrCBQAwVUwLfy14rJcCvd309Lyd+mRHptmRAACoMxQuAIDp2ob66uPHe6u5v6eeW5iijzZnmB0JAIA6QeECADQIUcHe+nhSb7UK9tZvl+zVxjQW0gAA2D8KFwCgwWgR4KlZE3rI09VZUxek6GxRudmRAAC4IRQuAECDEhXsrZdGdtbJglL9cvFuGYZhdiQAAK4bhQsA0OCMTgjXiNjm+uK7U5q39ZjZcQAAuG4ULgBAg2OxWPTHu7uoRYCn/m/5d0rNLjQ7EgAA14XCBQBokPw9XfX6uHiVV1o1ed4ulVVWmR0JAIBao3ABABqsHlGBeuq2ttqXVaC/fn7A7DgAANQahQsA0KBNvq2NukUG6P31R/TtwdNmxwEAoFYoXACABs3F2UkzxnWVr7uLpn6copzCMrMjAQBQYxQuAECDFxHopZfvjlFOYZleWMRS8QAA+0HhAgDYhcT4FhrVtYXW7M/WB5syzI4DAECNULgAAHbjpcTOigz00h9Xfq/9JwvMjgMAwDVRuAAAdsPXw1UzxsWrymromXnJKq1gqXgAQMNG4QIA2JWukU00dVA7HTh1Tq+u/N7sOAAAXBWFCwBgdybd3Fo3tQrUvzdl6OvvT5kdBwCAK6JwAQDsjrOTRa/dGy8/Dxf9fNFuZReUmh0JAIDLonABAOxSWICn/nRPrM4Uleu5hSmyWlkqHgDQ8FC4AAB2a1iX5hrXI0LrDuXoXxuOmB0HAIBLULgAAHZt2p2dFB3srT9/vl97j+ebHQcAgItQuAAAds3LzUUzxnWVJE2ev0vF5ZUmJwIA4L8oXAAAu9cl3F8/H9xeh08X6Q/L95kdBwCAahQuAECj
8Ei/aPVrE6x5W4/p871ZZscBAEAShQsA0Eg4OVn0t7FxauLlql98skdZ+SVmRwIAgMIFAGg8Qv089JfRccovqdCzC5JVxVLxAACT1bhwTZs2TXl5eTIMQ8OHD1dwcLA++eQTW2YDAKDWBnUK1fheLbX58Bm98580s+MAABxcjQvXkiVLFBAQoNWrV8vFxUUbNmzQyy+/bMtsAABcl18P76i2TX302lcHlXwsz+w4AAAHVuPC5eR0ftP//Oc/GjNmjNq3by+LxWKzYAAAXC8PV2e98ZOucnKy6Jn5u1RYxlLxAABz1LhweXt7689//rPmz5+vQYMGyTAMlZeX2zIbAADXrWNzP/1qaAdl5Bbrd0u+MzsOAMBB1bhwzZ49W1lZWfrLX/6i0NBQpaWl6YEHHrBlNgAAbsiEPlG6pX2IPtmZqaUpJ8yOAwBwQBbDMGq9hFN+fr6OHTummJgYW2SqlfDwcGVmZpodAwDQQOUUlmnI6+tUVlmllZP7KyLQy+xIEOM3AMdR4zNcQ4YMUV5engoLCxUXF6cRI0Zo2rRptswGAMANC/Zx1/QxsTpXWqlnFySrsspqdiQAgAOpceE6deqUAgICtHLlSiUmJurQoUP69NNPbZkNAIA6cUv7pprYt5W2Z5zVzG9SzY4DAHAgNS5cFRUVkqRvv/1WgwYNkqurq1xcXGwWDACAuvSLoe3Vsbmf3vj6kLannzE7DgDAQdS4cMXExGjo0KFavny5brvtNhUXF9syFwAAdcrdxVlvjIuXm4uTnpmfrILSCrMjAQAcQK1WKXz88cf1zTffyMvLS2fPntWrr75qy2wAANSptqG++s3wTjqeV6Jff7pX17FuFAAAtVLjwuXh4aGEhARt2rRJc+fOlWEYGjJkyDW/bvLkyYqKipLFYlFycvJlt7FarXr++ecVExOjDh066OGHH+Y9vgAANnH/TZEa1ClUy1JOaPHO42bHAQA0cjUuXEuWLFHXrl21cOFCLVy4UN26ddOyZcuu+XWjR4/W+vXr1bJlyytu889//lM7d+7Uzp079f3338vJyUkzZsyoaTQAAGrMYrHoz/fEKtTPXdOW7FV6TpHZkQAAjViNC9dLL72kzZs369NPP9Wnn36qjRs36ne/+901v27AgAEKDw+/6jYpKSkaOHCg3NzcZLFYNHToUH344Yc1jQYAQK0EerspaWy8iiuq9MyCZFWwVDwAwEZqXLiqqqrUpk2b6o/btGkjq7VuBqiEhAQtXbpUBQUFqqio0Mcff6z09PQ62TcAAJfTt02wHhsQrZRjeXp99UGz4wAAGqkaF66mTZvq/fffl9VqldVq1T//+U+FhITUSYgJEyZoyJAhuvnmm3XzzTerXbt2V1xyPikpSeHh4dW3wsLCOskAAHA8zw1qry4t/PX3tWnalJZrdhwAQCNkMWq4RFNaWpruv/9+7dq1S5LUrVs3/e1vf1OfPn1q9EBRUVH67LPPFB8ff81t58+fr7feekvr1q275rbh4eHKzMysUQYAAP7X4dOFGvHmevl5uOrzKf0V4OVmdiSHwPgNwFHU+AxX69attXnzZuXm5io3N1ebNm3SuHHj6iREaWmpzp49K0nKycnRn/70J73wwgt1sm8AAK4mOsRHvx/ZWScLSvXLT/awVDwAoE7VuHBd4OPjIx8fH0mq0aD0+OOPV/8v1uDBg6uvA3vkkUe0dOlSSVJ+fr769Omjzp07q3///po0aZLuvPPO2kYDAOC6jEkI1/AuzfX5dye1YNsxs+MAABqRGk8pvJzIyEgdPXq0LvPUGlMSAAB1Ib+4QkNnfKuzxRVa9nQ/tWnqY3akRo3xG4CjuPzKFD+ye/fuK95XUVFRp2EAADCLv5erXh/XVePe26QpC3Zp6ZP95ORkMTsWAMDOXbNwJSYmXvE+T0/POg0DAICZerYK1EN9ojRrQ7q2pp9Rr+ggsyMBAOzcNQvXkSNH6iMHAAANwr09IjRrQ7qWppygcAEAblitF80AAKAx69DMT+1CfbRqT5YqqqxmxwEA2DkKFwAA/2NkXJjOFldo/aEcs6MAAOwchQsAgP9xZ1yYJGlpygmTkwAA7B2FCwCA/9EyyFtxEQH68ruTKimvMjsOAMCOUbgAALiMkXFhKiqv0pr92WZHAQDYMQoXAACXMSK2uSwWaWnKcbOjAADsGIULAIDLCPXzUK9WQfrmwGkVlFaYHQcAYKcoXAAAXMHI+DCVV1r1xd6TZkcBANgpChcAAFcwNKaZXJ0trFYIALhuFC4AAK4gwMtNA9qGaGNarnIKy8yOAwCwQxQuAACuYmR8mKqshlbuyTI7CgDADlG4AAC4ioEdQ+Xh6qSlyUwrBADUHoULAICr8HZ30cCOodqecVaZZ4vNjgMAsDMULgAAriExvoUkaVkK0woBALVD4QIA4BoGtAuWn4cLqxUCAGqNwgUAwDW4uzhraExzfZ9VoNTsc2bHAQDYEQoXAAA1MDI+TJJYPAMAUCsULgAAaqBXdJBCfN21NOWEDMMwOw4AwE5QuAAAqAFnJ4uGd2mu9Nxi7Tmeb3YcAICdoHABAFBDTCsEANQWhQsAgBrqGhGgiEBPLd+dJauVaYUAgGujcAEAUEMWi0V3xobpZEGptqafMTsOAMAOULgAAKiF6mmFvCcXAKAGKFwAANRCh2Z+ahfqo1V7slRRZTU7DgCggaNwAQBQSyPjwnS2uELrD+WYHQUA0MBRuAAAqKU745hWCACoGQoXAAC11DLIW3ERAfryu5MqKa8yOw4AoAGjcAEAcB0S48JUVF6lNfuzzY4CAGjAKFwAAFyHEbHN5WSRlqYcNzsKAKABo3ABAHAdmvp5qFd0kL7Zf1r5JRVmxwEANFAULgAArtPIuDCVV1n1xXcnzY4CAGigKFwAAFynoTHN5eps0TJWKwQAXAGFCwCA6+Tv5aqb24VoQ2qOTp8rMzsOAKABonABAHAD7owLk9WQVu7JMjsKAKABonABAHADBnUKlaerM2+CDAC4LAoXAAA3wMvNRQM7hWpHxlllni02Ow4AoIGhcAEAcINGxoVJkpalMK0QAHAxChcAADdoQLtg+Xm4MK0QAHAJChcAADfI3cVZQ2Oa6/usAqVmnzM7DgCgAaFwAQBQB0bGn59WuDSZs1wAgP+icAEAUAd6RQcpxNddS1NOyDAMs+MAABoIChcAAHXA2cmiEbHNlZ5brD3H882OAwBoIChcAADUkQurFTKtEABwAYULAIA6Eh8RoMhALy3bfUJVVqYVAgDqoXBNnjxZUVFRslgsSk5Ovuw2VqtVU6dOVadOnRQbG6tbb71Vqampto4GAECdslgsujOuuU4VlGnrkTNmxwEANAA2L1yjR4/W+vXr1bJlyytus3TpUm3YsEEpKSnavXu3br/9dr344ou2jgYAQJ0bGddCknhPLgCApHooXAMGDFB4ePhVt7FYLCorK1NpaakMw1BBQcE1vwYAgIaofTNftQ/11aq9WSqvtJodBwBgsgZxDdedd96pW265Rc2aNVPz5s319ddf6//+7/8uu21SUpLCw8Orb4W
FhfWcFgCAqxsZH6a84gqtTz1tdhQAgMkaROHavn279u7dq+PHj+vEiRO6/fbbNWnSpMtuO3XqVGVmZlbffHx86jktAABXd2csqxUCAM5rEIXrgw8+0G233aaAgAA5OTnpoYce0jfffGN2LAAArktkkJfiIwL05b5TKimvMjsOAMBEDaJwRUdHa82aNSovL5ckLV++XDEx/9/evQdXdRb8Hv+tXAiXhEDIDpdskgDJ5oUGEi5pA4UdmBLpy9s37VSkeqidVhmwlUHEjo7OUevYw/hHRT11LHqGwapFa6XyYovTA1S5vFKaWjYpbbkk5ZYSLuESEgK57ef8QbMPRYgVsvbaa63vZ2bPNFkrye/pWnslP/aznl3scCoAAG5dVckItbZ3aev+U05HAQA4yPbCtWTJEgWDQdXX12vu3LkqLCyUJC1atEgbN26UJH35y1/WqFGjVFJSookTJ2rr1q167rnn7I4GAIBt7ps4XEkW0woBwO8sY4yr35mxu8wBAJBo/sf/eUNvHTmv6v85R5n9Up2Ok1D4/Q3ALxJiSiEAAF5UVTJC7V1RvfbuSaejAAAcQuECAMAm/148XKnJlv7EmyADgG9RuAAAsElm/1RVhHL037WNOtPc5nQcAIADKFwAANioqnSEokba9E6D01EAAA6gcAEAYKM543LULzVZG5lWCAC+ROECAMBG/fukqHL8UP396HnVn291Og4AIM4oXAAA2KyqZIQk6U97mVYIAH5D4QIAwGbhUECZ/VL1X5EPnY4CAIgzChcAADbrk5Kkfy8epv0nm3XoVLPTcQAAcUThAgAgDrqnFbJ4BgD4C4ULAIA4uGv0EOVkpGnj3hMyxjgdBwAQJxQuAADiIDnJ0n9MHK6jZ1tVU9/kdBwAQJxQuAAAiBOmFQKA/1C4AACIk9KRg5SX1V+v1JxQV5RphQDgBxQuAADixLIs/WfJcJ262KY3D59zOg4AIA4oXAAAxFFVSa4kphUCgF9QuAAAiKOxwzI0dmiG/ryvQe2dUafjAABsRuECACDOqkpH6EJrh3bWnnE6CgDAZhQuAADiLLZaYYRphQDgdRQuAADibGRWf03KG6T/+94pXW7vcjoOAMBGFC4AABxQVTJCre1d2rr/lNNRAAA2onABAOCA/5g4XEkW0woBwOsoXAAAOCAno6+mjRmivx44o6bLHQNsR8IAABdvSURBVE7HAQDYhMIFAIBDqkpGqL0rqtf2nXQ6CgDAJhQuAAAccu8dw5WabPEmyADgYRQuAAAcktk/VRWhHP2trlGnm684HQcAYAMKFwAADqoqHaGokTbVNDgdBQBgAwoXAAAOmjMuR/1Sk5lWCAAeReECAMBB/fukqHL8UL197IKOn2t1Og4AoJdRuAAAcFhVyQhJ0p9qeJULALyGwgUAgMPCoYAy+6XyJsgA4EEULgAAHNYnJUnzJgzT/pPNOnSq2ek4AIBeROECACAB/OdH0wpZPAMAvIXCBQBAArhr1BDlZKRp494TMsY4HQcA0EsoXAAAJIDkJEv3TRyho2dbVVPf5HQcAEAvoXABAJAgqkqZVggAXkPhAgAgQZQEM5U/pL9eqTmhrijTCgHACyhcAAAkCMuyNG/CcJ262KZ3PmRaIQB4AYULAIAEMntsjiRp24EzDicBAPQGChcAAAlkUt4gpaelaPshChcAeAGFCwCABJKanKS7C4doz7HzamrtcDoOAOA2UbgAAEgw4VBAUSPtrG10OgoA4DZRuAAASDDhooAkaftBphUCgNtRuAAASDAjs/prdGCAth08I2NYHh4A3IzCBQBAAqoIBXTy4hUdOt3idBQAwG2gcAEAkIAqQlenFbI8PAC4m+2Fa9myZSooKJBlWYpEIjfcZ+3atSotLY09srOz9eCDD9odDQCAhHXXqCHqk5LE8vAA4HK2F6758+dr586dys/Pv+k+jz32mCKRSOwxbNgwLVy40O5oAAAkrH59knXXqCzt/uCcWts7nY4DALhFtheucDisYDD4ifffvXu3Tp8+raqqKhtTAQCQ+CpCAbV3RbX7g3NORwEA3KKEu4drzZo1+vznP6/U1NQbbl+1apWCwWDs0dLCzcQAAG+K3cfF8vAA4FoJVbguXbqk3/3ud/riF794031WrFih+vr62CM9PT2OCQEAiJ/CnHSNyOzL+3EBgIslVOF66aWXdMcdd2j8+PFORwEAwHGWZSkcCuiDxks6fq7V6TgAgFuQUIVrzZo1Pb66BQCA3zCtEADczfbCtWTJEgWDQdXX12vu3LkqLCyUJC1atEgbN26M7XfgwAFFIhE99NBDdkcCAMA1phdmKznJYlohALiUZYwxToe4Hd1lDgAAr5r/3N+0/2Sz3v52pfqkJNTklFvG728AfuGNqzYAAB4WDgXU0tapt4+ddzoKAOBfROECACDBdd/HxbRCAHAfChcAAAluQm6msgb0YeEMAHAhChcAAAkuKcnSjMJsvXvios40tzkdBwDwL6BwAQDgAt3TCncc4lUuAHATChcAAC4wM5Qtifu4AMBtKFwAALhATkZfjR8+UNsPNSoadfU7ugCAr1C4AABwiXAooHOX2rXvRJPTUQAAnxCFCwAAl2B5eABwHwoXAAAuMSV/sAb0SWZ5eABwEQoXAAAu0SclSdPGZOvtYxd08UqH03EAAJ8AhQsAABepGBtQV9Tob7WNTkcBAHwCFC4AAFykoujqfVxMKwQAd6BwAQDgInlD+mtU9gBtP9goY1geHgASHYULAACXCRdl68MLl1V3psXpKACAf4LCBQCAy1SM7Z5WyH1cAJDoKFwAALhM+egh6pOcxH1cAOACFC4AAFymf58UlY0arN0fnNWVji6n4wAAekDhAgDAhSpCAbV1RrX78DmnowAAekDhAgDAhcKhj+7jOsC0QgBIZBQuAABcaOzQDA0dmKbthyhcAJDIKFwAALiQZVmqCAVUe7pFH1647HQcAMBNULgAAHCp7mmF21mtEAASFoULAACXmlGYrSSL+7gAIJFRuAAAcKlB/fuoZOQg/Xdtozq6ok7HAQDcAIULAAAXqwgF1NzWqcjxC05HAQDcAIULAAAXY3l4AEhsFC4AAFysJDhImf1SWR4eABIUhQsAABdLTrI0syhb73zYpLMtbU7HAQBch8IFAIDLhUMBGSPtrG10OgoA4DoULgAAXK6C+7gAIGFRuAAAcLmhA/vq34ZlaPuhRkWjxuk4AIBrULgAAPCAilBAjS1teq/hotNRAADXoHABAOABseXhDzKtEAASCYULAAAPmFowWP1Sk7WdwgUACYXCBQCAB6SlJGv6mCH6+9Hzar7S4XQcAMBHKFwAAHhEOBRQZ9RoV91Zp6MAAD5C4QIAwCMquI8LABIOhQsAAI8oyB6gvKz+2nbwjIxheXgASAQULgAAPKQiFFD9+cs63HjJ6SgAAFG4AADwFJaHB4DEQuECAMBDpo0ZotRki+XhASBBULgAAPCQ9LQUTc3P0q4PzupKR5fTcQDA9yhcAAB4TDgU0JWOqN46ct7pKADgexQuAAA85v8vD3/a4SQAAA
(base64-encoded PNG image data for the loss plot omitted)\n", + "text/plain": [ + "
" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "import matplotlib.pyplot as plt\n", + "import re\n", + "\n", + "# Define a function that, for the given tensor name, walks through all \n", + "# the iterations for which we have data and fetches the value.\n", + "# Returns the set of steps and the values\n", + "def get_data(trial, tname):\n", + " tensor = trial.tensor(tname)\n", + " steps = tensor.steps()\n", + " vals = [tensor.value(s) for s in steps]\n", + " return steps, vals\n", + "\n", + "def plot_tensors(trial, collection_name, ylabel=''):\n", + " \"\"\"\n", + " Takes a `trial` and plots all tensors that match the given regex.\n", + " \"\"\"\n", + " plt.figure(\n", + " num=1, figsize=(8, 8), dpi=80,\n", + " facecolor='w', edgecolor='k')\n", + "\n", + " tensors = trial.tensor_names(collection=collection_name)\n", + "\n", + " for tensor_name in sorted(tensors):\n", + " steps, data = get_data(trial, tensor_name)\n", + " plt.plot(steps, data, label=tensor_name)\n", + "\n", + " plt.legend(bbox_to_anchor=(1.04,1), loc='upper left')\n", + " plt.xlabel('Iteration')\n", + " plt.ylabel(ylabel)\n", + " plt.show()\n", + " \n", + "plot_tensors(trial, \"losses\", ylabel=\"Loss\")" + ] + } + ], + "metadata": { + "kernelspec": { + "display_name": "conda_tensorflow_p36", + "language": "python", + "name": "conda_tensorflow_p36" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.6.5" + } + }, + "nbformat": 4, + "nbformat_minor": 4 +} \ No newline at end of file diff --git a/sagemaker-debugger/tensorflow_keras_custom_rule/rules/my_custom_rule.py b/sagemaker-debugger/tensorflow_keras_custom_rule/rules/my_custom_rule.py new file mode 100644 index 0000000000..d402d08bc0 --- /dev/null +++ b/sagemaker-debugger/tensorflow_keras_custom_rule/rules/my_custom_rule.py @@ -0,0 +1,17 @@ +# First Party +from smdebug.rules.rule import Rule + + +class CustomGradientRule(Rule): + def __init__(self, base_trial, threshold=10.0): + super().__init__(base_trial) + self.threshold = float(threshold) + + def invoke_at_step(self, step): + for tname in self.base_trial.tensor_names(collection="gradients"): + t = self.base_trial.tensor(tname) + abs_mean = t.reduction_value(step, "mean", abs=True) + if abs_mean > self.threshold: + return True + return False + \ No newline at end of file diff --git a/sagemaker-debugger/tensorflow_keras_custom_rule/src/tf_keras_resnet_byoc.py b/sagemaker-debugger/tensorflow_keras_custom_rule/src/tf_keras_resnet_byoc.py new file mode 100644 index 0000000000..9e7e8718be --- /dev/null +++ b/sagemaker-debugger/tensorflow_keras_custom_rule/src/tf_keras_resnet_byoc.py @@ -0,0 +1,69 @@ +""" +This script is a ResNet training script which uses Tensorflow's Keras interface. +It has been orchestrated with SageMaker Debugger hooks to allow saving tensors during training. +These hooks have been instrumented to read from json configuration that SageMaker will put in the training container. +Configuration provided to the SageMaker python SDK when creating a job will be passed on to the hook. +This allows you to use the same script with differing configurations across different runs. +If you use an official SageMaker Framework container (i.e. AWS Deep Learning Container), then +you do not have to orchestrate your script as below. Hooks will automatically be added in those environments. 
+
+For more information, please refer to https://github.com/awslabs/sagemaker-debugger/blob/master/docs/sagemaker.md
+"""
+
+# Standard Library
+import argparse
+
+# Third Party
+import numpy as np
+import tensorflow as tf
+from tensorflow.keras.applications.resnet50 import ResNet50
+from tensorflow.keras.datasets import cifar10
+from tensorflow.keras.utils import to_categorical
+import smdebug.tensorflow as smd
+
+
+def train(batch_size, epoch, model, hook):
+    (X_train, y_train), (X_valid, y_valid) = cifar10.load_data()
+
+    Y_train = to_categorical(y_train, 10)
+    Y_valid = to_categorical(y_valid, 10)
+
+    X_train = X_train.astype('float32')
+    X_valid = X_valid.astype('float32')
+
+    mean_image = np.mean(X_train, axis=0)
+    X_train -= mean_image
+    X_valid -= mean_image
+    X_train /= 128.
+    X_valid /= 128.
+
+    # register the smdebug hook as a Keras callback so tensors are saved during fit
+    model.fit(X_train, Y_train,
+              batch_size=batch_size,
+              epochs=epoch,
+              validation_data=(X_valid, Y_valid),
+              shuffle=True,
+              callbacks=[hook])
+
+
+def main():
+    parser = argparse.ArgumentParser(description="Train resnet50 cifar10")
+    parser.add_argument("--batch_size", type=int, default=32)
+    parser.add_argument("--epoch", type=int, default=3)
+    parser.add_argument("--model_dir", type=str, default="./model_keras_resnet")
+    args = parser.parse_args()
+
+    model = ResNet50(weights=None, input_shape=(32,32,3), classes=10)
+
+    # Create hook from the configuration provided through the SageMaker Python SDK
+    hook = smd.KerasHook.create_from_json_file()
+    optimizer = tf.keras.optimizers.Adam()
+    # wrap the optimizer so the hook can identify the gradients
+    optimizer = hook.wrap_optimizer(optimizer)
+    model.compile(loss='categorical_crossentropy',
+                  optimizer=optimizer,
+                  metrics=['accuracy'])
+
+    # start the training
+    train(args.batch_size, args.epoch, model, hook)
+
+if __name__ == "__main__":
+    main()
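+
+# Illustrative example of how SageMaker invokes this script inside the training
+# container (hyperparameters passed to the Estimator become command-line arguments):
+#
+#   python tf_keras_resnet_byoc.py --batch_size 32 --epoch 3
+#
+# The smdebug hook configuration itself is read from the JSON file that SageMaker
+# places in the container (see smd.KerasHook.create_from_json_file above).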
diff --git a/sagemaker-debugger/tensorflow_keras_custom_rule/src/tf_keras_resnet_zerocodechange.py b/sagemaker-debugger/tensorflow_keras_custom_rule/src/tf_keras_resnet_zerocodechange.py
new file mode 100644
index 0000000000..3492cd7c1d
--- /dev/null
+++ b/sagemaker-debugger/tensorflow_keras_custom_rule/src/tf_keras_resnet_zerocodechange.py
@@ -0,0 +1,61 @@
+"""
+This script is a ResNet training script which uses TensorFlow's Keras interface.
+It is designed to be used with SageMaker Debugger in an official SageMaker Framework container (i.e. AWS Deep Learning Container). You will notice that this script looks exactly like a normal TensorFlow training script.
+The hook needed by SageMaker Debugger to save tensors during training will be automatically added in those environments.
+The hook loads its configuration from a JSON file that SageMaker places in the training container, built from the configuration you provide to the SageMaker Python SDK when creating the job.
+
+For more information, please refer to https://github.com/awslabs/sagemaker-debugger/blob/master/docs/sagemaker.md
+"""
+
+# Standard Library
+import argparse
+import random
+import time
+
+# Third Party
+import numpy as np
+from tensorflow.keras.applications.resnet50 import ResNet50
+from tensorflow.keras.datasets import cifar10
+from tensorflow.keras.callbacks import Callback
+from tensorflow.keras.optimizers import Adam
+from tensorflow.keras.utils import to_categorical
+
+
+def train(batch_size, epoch, model):
+    (X_train, y_train), (X_valid, y_valid) = cifar10.load_data()
+
+    Y_train = to_categorical(y_train, 10)
+    Y_valid = to_categorical(y_valid, 10)
+
+    X_train = X_train.astype('float32')
+    X_valid = X_valid.astype('float32')
+
+    mean_image = np.mean(X_train, axis=0)
+    X_train -= mean_image
+    X_valid -= mean_image
+    X_train /= 128.
+    X_valid /= 128.
+
+    model.fit(X_train, Y_train,
+              batch_size=batch_size,
+              epochs=epoch,
+              validation_data=(X_valid, Y_valid),
+              shuffle=True)
+
+
+def main():
+    parser = argparse.ArgumentParser(description="Train resnet50 cifar10")
+    parser.add_argument("--batch_size", type=int, default=128)
+    parser.add_argument("--epoch", type=int, default=3)
+    parser.add_argument("--model_dir", type=str, default="./model_keras_resnet")
+    opt = parser.parse_args()
+
+    model = ResNet50(weights=None, input_shape=(32,32,3), classes=10)
+    model.compile(loss='categorical_crossentropy',
+                  optimizer='adam',
+                  metrics=['accuracy'])
+
+    # start the training
+    train(opt.batch_size, opt.epoch, model)
+
+if __name__ == "__main__":
+    main()
diff --git a/sagemaker-debugger/tensorflow_keras_custom_rule/tf-keras-custom-rule.ipynb b/sagemaker-debugger/tensorflow_keras_custom_rule/tf-keras-custom-rule.ipynb
new file mode 100644
index 0000000000..cbb5257918
--- /dev/null
+++ b/sagemaker-debugger/tensorflow_keras_custom_rule/tf-keras-custom-rule.ipynb
@@ -0,0 +1,311 @@
+{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [
+ "# Amazon SageMaker - Debugging with custom rules\n",
+ "[Amazon SageMaker](https://aws.amazon.com/sagemaker/) is a managed platform to build, train, and host machine learning models. Amazon SageMaker Debugger is a new feature which offers the capability to debug machine learning models during training by detecting problems with the models in near real-time. \n",
+ "\n",
+ "In this notebook, we'll show you how to use a custom rule to monitor your training job, all through a tf.keras ResNet example.\n",
+ "\n",
+ "## How does Amazon SageMaker Debugger work?\n",
+ "\n",
+ "Amazon SageMaker Debugger lets you go beyond just looking at scalars like losses and accuracies during training and gives you full visibility into all tensors 'flowing through the graph' during training.
Furthermore, it helps you monitor your training in near real-time using rules and provides alerts once it detects inconsistencies in the training flow.\n",
+ "\n",
+ "### Concepts\n",
+ "* **Tensors**: These represent the state of the training network at intermediate points during its execution\n",
+ "* **Debug Hook**: The hook is the construct with which Amazon SageMaker Debugger looks into the training process and captures the tensors requested at the desired step intervals\n",
+ "* **Rule**: A logical construct, implemented as Python code, which helps analyze the tensors captured by the hook and report anomalies, if any\n",
+ "\n",
+ "With these concepts in mind, let's understand the overall flow of things that Amazon SageMaker Debugger uses to orchestrate debugging.\n",
+ "\n",
+ "### Saving tensors during training\n",
+ "\n",
+ "The tensors captured by the debug hook are stored in the S3 location specified by you. There are two ways you can configure Amazon SageMaker Debugger to save tensors:\n",
+ "\n",
+ "#### With no changes to your training script\n",
+ "If you use one of the Amazon SageMaker provided [Deep Learning Containers](https://docs.aws.amazon.com/sagemaker/latest/dg/pre-built-containers-frameworks-deep-learning.html) for TensorFlow 1.15, then you don't need to make any changes to your training script for the tensors to be stored. Amazon SageMaker Debugger will use the configuration you provide through the Amazon SageMaker SDK's TensorFlow `Estimator` when creating your job to save the tensors in the fashion you specify. You can review the script we are going to use at [src/tf_keras_resnet_zerocodechange.py](src/tf_keras_resnet_zerocodechange.py). You will note that this is an untouched TensorFlow Keras script which uses the `tf.keras` interface. Please note that Amazon SageMaker Debugger only supports the `tf.keras`, `tf.estimator` and `tf.MonitoredSession` interfaces for the zero script change experience. A full description of support is available at [Amazon SageMaker Debugger with TensorFlow](https://github.com/awslabs/sagemaker-debugger/tree/master/docs/tensorflow.md).\n",
+ "\n",
+ "#### Orchestrating your script to store tensors\n",
+ "For other containers, you need to make a couple of lines of changes to your training script. Amazon SageMaker Debugger exposes a library, `smdebug`, which allows you to capture these tensors and save them for analysis. It's highly customizable and allows you to save the specific tensors you want at different frequencies and possibly with other configurations. Refer to the [Developer Guide](https://github.com/awslabs/sagemaker-debugger/tree/master/docs) for details on how to use the Amazon SageMaker Debugger library with your choice of framework in your training script. Here we have an example script orchestrated at [src/tf_keras_resnet_byoc.py](src/tf_keras_resnet_byoc.py). In this case, you will also need to ensure that your container has the `smdebug` library installed, and specify your container image URI when creating the SageMaker Estimator below. Please refer to the [SageMaker documentation](https://sagemaker.readthedocs.io/en/stable/sagemaker.tensorflow.html) for how to do that.\n",
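+ "\n",
+ "The \"couple of lines\" in question, condensed from [src/tf_keras_resnet_byoc.py](src/tf_keras_resnet_byoc.py) (the surrounding model, data, and training code is omitted), look like this:\n",
+ "\n",
+ "```python\n",
+ "import smdebug.tensorflow as smd\n",
+ "\n",
+ "hook = smd.KerasHook.create_from_json_file()  # reads the JSON config SageMaker puts in the container\n",
+ "optimizer = hook.wrap_optimizer(tf.keras.optimizers.Adam())  # lets the hook identify the gradients\n",
+ "model.compile(loss='categorical_crossentropy', optimizer=optimizer, metrics=['accuracy'])\n",
+ "model.fit(X_train, Y_train, callbacks=[hook])  # the hook also acts as a Keras callback\n",
+ "```\n",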
+ "\n",
+ "### Analysis of tensors\n",
+ "\n",
+ "Amazon SageMaker Debugger can be configured to run debugging ***Rules*** on the tensors saved from the training job. At a very broad level, a rule is Python code used to detect certain conditions during training. Some of the conditions that a data scientist training an algorithm may care about are monitoring for gradients getting too large or too small, detecting overfitting, and so on. Amazon SageMaker Debugger comes pre-packaged with certain built-in rules. Users can write their own rules using the APIs provided by Amazon SageMaker Debugger through the `smdebug` library. You can also analyze raw tensor data outside of the Rules construct in, say, a SageMaker notebook, using these APIs. Please refer to the [Analysis Developer Guide](https://github.com/awslabs/sagemaker-debugger/blob/master/docs/api.md) for more on these APIs.\n"
+ ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [
+ "## Training TensorFlow Keras models with Amazon SageMaker Debugger\n",
+ "\n",
+ "### Amazon SageMaker TensorFlow as a framework\n",
+ "\n",
+ "Train a TensorFlow Keras model in this notebook with Amazon SageMaker Debugger enabled and monitor the training jobs with rules. This is done using the Amazon SageMaker [TensorFlow 1.15.0](https://docs.aws.amazon.com/sagemaker/latest/dg/pre-built-containers-frameworks-deep-learning.html) container as the framework."
+ ] + }, + { + "cell_type": "code", + "execution_count": 2, + "metadata": {}, + "outputs": [], + "source": [
+ "import boto3\n",
+ "import os\n",
+ "import sagemaker\n",
+ "from sagemaker.tensorflow import TensorFlow"
+ ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [
+ "Import the libraries needed for the demo of Amazon SageMaker Debugger."
+ ] + }, + { + "cell_type": "code", + "execution_count": 4, + "metadata": {}, + "outputs": [], + "source": [
+ "from sagemaker.debugger import Rule, DebuggerHookConfig, TensorBoardOutputConfig, CollectionConfig\n",
+ "import smdebug_rulesconfig as rule_configs"
+ ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [
+ "Now define the entry point for the training script."
+ ] + }, + { + "cell_type": "code", + "execution_count": 5, + "metadata": {}, + "outputs": [], + "source": [
+ "# define the entrypoint script\n",
+ "entrypoint_script='src/tf_keras_resnet_zerocodechange.py'"
+ ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [
+ "### Setting up the Estimator\n",
+ "\n",
+ "Now it's time to set up our SageMaker TensorFlow Estimator. There are new parameters on the estimator to enable your training job for debugging through Amazon SageMaker Debugger. These new parameters are explained below:\n",
+ "\n",
+ "* **debugger_hook_config**: This new parameter accepts a local path where you wish your tensors to be written to and also accepts the S3 URI where you wish your tensors to be uploaded to. It also accepts CollectionConfigurations, which specify which tensors will be saved from the training job.\n",
+ "* **rules**: This new parameter accepts a list of rules you wish to evaluate against the tensors output by this training job.\n",
+ "\n",
+ "For rules, Amazon SageMaker Debugger supports two types:\n",
+ "* **Amazon SageMaker Rules**: These are rules curated by the Amazon SageMaker team, which you can choose to evaluate against your training job.\n",
+ "* **Custom Rules**: You can optionally choose to write your own rule as a Python source file and have it evaluated against your training job.
For SageMaker Debugger to evaluate this rule, you have to provide the S3 location of the rule source and the evaluator image.\n",
+ "\n",
+ "#### Creating your own custom rule\n",
+ "\n",
+ "Let us briefly look at how you can create your custom rule before proceeding to use it with your training job. Please see the [documentation](https://github.com/awslabs/sagemaker-debugger/blob/master/docs/analysis.md) to learn more about structuring your rules and other related concepts.\n",
+ "\n",
+ "##### **Summary of what the custom rule evaluates**\n",
+ "For demonstration purposes, below is a rule that tracks whether gradients are getting too large. The custom rule looks at the tensors in the collection \"gradients\" saved during training and attempts to get the absolute values of the gradients in each step of the training. If the mean of the absolute values of the gradients in any step is greater than a specified threshold, the rule is marked as 'triggering'. Let us look at how to structure the rule source.\n",
+ "\n",
+ "Any custom rule logic you want to be evaluated should extend the `Rule` interface provided by Amazon SageMaker Debugger:\n",
+ "\n",
+ "```python\n",
+ "from smdebug.rules.rule import Rule\n",
+ "\n",
+ "class CustomGradientRule(Rule):\n",
+ "```\n",
+ "\n",
+ "Now implement the class methods for the rule. Doing this allows Amazon SageMaker to understand the intent of the rule and evaluate it against your training tensors.\n",
+ "\n",
+ "##### Rule class constructor\n",
+ "\n",
+ "In order for Amazon SageMaker to instantiate your rule, your rule class constructor must conform to the following signature.\n",
+ "```python\n",
+ "    def __init__(self, base_trial, other_trials, <other parameters>)\n",
+ "```\n",
+ "###### Arguments\n",
+ "- `base_trial (Trial)`: This defines the primary [Trial](https://github.com/awslabs/sagemaker-debugger/blob/master/docs/analysis.md#trial) that your rule is anchored to. This is an object of class type `Trial`.\n",
+ "\n",
+ "- `other_trials (list[Trial])`: *(Optional)* This defines a list of 'other' trials you want your rule to look at. This is useful in scenarios where you're comparing tensors from the base_trial to tensors from some other trials.\n",
+ "\n",
+ "- `<other parameters>`: This is similar to `**kwargs`, in that you can pass in as many string parameters as you need in your constructor signature. Note that SageMaker is only able to supply string values for these parameters at runtime (you will see how later).\n",
+ "\n",
+ "##### Defining the rule logic to be invoked at each step\n",
+ "\n",
+ "This defines the logic to be invoked for each step. Essentially, this is where you decide whether the rule should trigger or not. In this case, you're concerned about the gradients getting too large. So, get the [tensor reduction](https://github.com/awslabs/sagemaker-debugger/blob/master/docs/analysis.md#reduction_value) \"mean\" for each step and see if its value is larger than a threshold.\n",
+ "\n",
+ "```python\n",
+ "    def invoke_at_step(self, step):\n",
+ "        for tname in self.base_trial.tensor_names(collection=\"gradients\"):\n",
+ "            t = self.base_trial.tensor(tname)\n",
+ "            abs_mean = t.reduction_value(step, \"mean\", abs=True)\n",
+ "            if abs_mean > self.threshold:\n",
+ "                return True\n",
+ "        return False\n",
+ "```\n",
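+ "\n",
+ "Here, `self.threshold` comes from the concrete constructor of the rule used in this notebook, [rules/my_custom_rule.py](rules/my_custom_rule.py), which casts the incoming value because rule parameters are supplied as strings:\n",
+ "\n",
+ "```python\n",
+ "    def __init__(self, base_trial, threshold=10.0):\n",
+ "        super().__init__(base_trial)\n",
+ "        self.threshold = float(threshold)\n",
+ "```\n",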
+ "\n",
+ "#### Using your custom rule with SageMaker Estimator\n",
+ "\n",
+ "Below we create the rule configuration using the `Rule.custom` method, and then pass it to the SageMaker TensorFlow estimator to kick off the job. Note that you need to pass the rule evaluator container image for custom rules. Please refer to the AWS documentation to find the image URI for your region. We will soon have this be automatically taken care of by the SageMaker SDK. You can also provide your own image; please refer to [this repository](https://github.com/awslabs/sagemaker-debugger-rules-container) for instructions on how to build such a container."
+ ] + }, + { + "cell_type": "code", + "execution_count": 21, + "metadata": {}, + "outputs": [], + "source": [
+ "custom_rule = Rule.custom(\n",
+ "    name='MyCustomRule',  # used to identify the rule\n",
+ "    # rule evaluator container image\n",
+ "    image_uri='759209512951.dkr.ecr.us-west-2.amazonaws.com/sagemaker-debugger-rule-evaluator:latest',\n",
+ "    instance_type='ml.t3.medium',  # instance type to run the rule evaluation on\n",
+ "    source='rules/my_custom_rule.py',  # path to the rule source file\n",
+ "    rule_to_invoke='CustomGradientRule',  # name of the class to invoke in the rule source file\n",
+ "    volume_size_in_gb=30,  # EBS volume size required to be attached to the rule evaluation instance\n",
+ "    collections_to_save=[CollectionConfig(\"gradients\")],\n",
+ "    # collections to be analyzed by the rule. Since this is a first-party collection, we fetch it as above\n",
+ "    rule_parameters={\n",
+ "        \"threshold\": \"20.0\"  # this will be used to initialize the 'threshold' param in your constructor\n",
+ "    }\n",
+ ")\n"
+ ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [
+ "Before you proceed and create your training job, let us take a closer look at the parameters used to create the Rule configuration above:\n",
+ "\n",
+ "* `name`: This is used to identify this particular rule among the suite of rules you specified to be evaluated.\n",
+ "* `image_uri`: This is the image of the container that has the logic of understanding your custom rule sources and evaluating them against the collections you save in the training job. You can get the list of open sourced SageMaker rule evaluator images [here]()\n",
+ "* `instance_type`: The type of the instance you want to run the rule evaluation on.\n",
+ "* `source`: This is the local path or the Amazon S3 URI of your rule source file.\n",
+ "* `rule_to_invoke`: This specifies the particular Rule class implementation in your source file which you want to be evaluated. SageMaker supports only one rule to be evaluated at a time in a rule job. Your source file can have multiple Rule class implementations, though.\n",
+ "* `collections_to_save`: This specifies which collections need to be saved for this rule to run. Note that providing a collection here does not necessarily mean the rule will actually use it. You might want to pass such parameters for the rule through the next argument, `rule_parameters`.\n",
+ "* `rule_parameters`: This provides the runtime values for the parameters in your constructor. You can also pass in other values which may be necessary for your rule to be evaluated. Any value in this map is available as an environment variable and can be accessed by your rule script using `$`\n",
+ "\n",
+ "You can read more about custom rule evaluation in Amazon SageMaker in this [documentation](https://github.com/awslabs/sagemaker-debugger/blob/master/docs/analysis.md).\n",
+ "\n",
+ "Let us now create the estimator and call `fit()` on it to start the training job and the rule evaluation job in parallel."
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [
+ "estimator = TensorFlow(\n",
+ "    role=sagemaker.get_execution_role(),\n",
+ "    base_job_name='smdebug-custom-rule-demo-tf-keras',\n",
+ "    train_instance_count=1,\n",
+ "    train_instance_type='ml.p2.xlarge',\n",
+ "    entry_point=entrypoint_script,\n",
+ "    framework_version='1.15',\n",
+ "    py_version='py3',\n",
+ "    train_max_run=3600,\n",
+ "    script_mode=True,\n",
+ "    ## New parameter\n",
+ "    rules=[custom_rule]\n",
+ ")\n",
+ "\n",
+ "# After calling fit, Amazon SageMaker starts one training job and one rule job for you.\n",
+ "# The rule evaluation status is visible in the training logs\n",
+ "# at regular intervals.\n",
+ "\n",
+ "estimator.fit(wait=False)"
+ ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [
+ "## Result\n",
+ "\n",
+ "As a result of calling `fit(wait=False)`, two jobs were kicked off in the background. Amazon SageMaker Debugger kicked off a rule evaluation job for our custom gradient logic in parallel with the training job. You can review the status of the above rule job as follows."
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [
+ "import time\n",
+ "status = estimator.latest_training_job.rule_job_summary()\n",
+ "while status[0]['RuleEvaluationStatus'] == 'InProgress':\n",
+ "    status = estimator.latest_training_job.rule_job_summary()\n",
+ "    print(status)\n",
+ "    time.sleep(10)\n"
+ ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [
+ "Once the rule job starts and you see the RuleEvaluationJobArn above, we can see the logs for the rule job in CloudWatch. To do that, we'll use this utility function to get a link to the rule job logs."
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [
+ "def _get_rule_job_name(training_job_name, rule_configuration_name, rule_job_arn):\n",
+ "    \"\"\"Helper function to get the rule job name with correct casing\"\"\"\n",
+ "    return \"{}-{}-{}\".format(\n",
+ "        training_job_name[:26], rule_configuration_name[:26], rule_job_arn[-8:]\n",
+ "    )\n",
+ "\n",
+ "def _get_cw_url_for_rule_job(rule_job_name, region):\n",
+ "    return \"https://{}.console.aws.amazon.com/cloudwatch/home?region={}#logStream:group=/aws/sagemaker/ProcessingJobs;prefix={};streamFilter=typeLogStreamPrefix\".format(region, region, rule_job_name)\n",
+ "\n",
+ "\n",
+ "def get_rule_jobs_cw_urls(estimator):\n",
+ "    training_job = estimator.latest_training_job\n",
+ "    training_job_name = training_job.describe()[\"TrainingJobName\"]\n",
+ "    rule_eval_statuses = training_job.describe()[\"DebugRuleEvaluationStatuses\"]\n",
+ "\n",
+ "    result = {}\n",
+ "    for status in rule_eval_statuses:\n",
+ "        if status.get(\"RuleEvaluationJobArn\", None) is not None:\n",
+ "            rule_job_name = _get_rule_job_name(training_job_name, status[\"RuleConfigurationName\"], status[\"RuleEvaluationJobArn\"])\n",
+ "            result[status[\"RuleConfigurationName\"]] = _get_cw_url_for_rule_job(rule_job_name, boto3.Session().region_name)\n",
+ "    return result\n",
+ "\n",
+ "get_rule_jobs_cw_urls(estimator)"
+ ] + } + ], + "metadata": { + "kernelspec": { + "display_name": "conda_tensorflow_p36", + "language": "python", + "name": "conda_tensorflow_p36" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.6.5" + } + }, + "nbformat": 4, + "nbformat_minor": 4 +}
diff --git a/sagemaker-debugger/xgboost_builtin_rules/data_utils.py b/sagemaker-debugger/xgboost_builtin_rules/data_utils.py
new file mode 100644
index 0000000000..9d9bcd9b64
--- /dev/null
+++ b/sagemaker-debugger/xgboost_builtin_rules/data_utils.py
@@ -0,0 +1,43 @@
+import random
+import tempfile
+import urllib.request
+
+import boto3
+
+
+def load_abalone(train_split=0.8, seed=42):
+
+    if not (0 < train_split <= 1):
+        raise ValueError("'train_split' must be between 0 and 1.")
+
+    url = "https://www.csie.ntu.edu.tw/~cjlin/libsvmtools/datasets/regression/abalone"
+
+    response = urllib.request.urlopen(url).read().decode("utf-8")
+    lines = response.strip().split('\n')
+    n = len(lines)
+    indices = list(range(n))
+    random.seed(seed)
+    random.shuffle(indices)
+    # use 'train_split' (not a hard-coded ratio) to pick the training subset
+    train_indices = set(indices[:int(n * train_split)])
+
+    with tempfile.NamedTemporaryFile(mode='w', delete=False) as train_file:
+        with tempfile.NamedTemporaryFile(mode='w', delete=False) as valid_file:
+            for idx, line in enumerate(lines):
+                if idx in train_indices:
+                    train_file.write(line + '\n')
+                else:
+                    valid_file.write(line + '\n')
+
+    return train_file.name, valid_file.name
+
+
+def write_to_s3(fobj, bucket, key):
+    return boto3.Session().resource('s3').Bucket(bucket).Object(key).upload_fileobj(fobj)
+
+
+def upload_to_s3(filename, bucket, key):
+    url = f"s3://{bucket}/{key}"
+    print(f"Writing to {url}")
+    with open(filename, "rb") as fobj:
+        write_to_s3(fobj, bucket, key)
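+
+# Example usage (illustrative; "my-bucket" is a placeholder for a bucket you own):
+# train_path, valid_path = load_abalone(train_split=0.8)
+# upload_to_s3(train_path, "my-bucket", "abalone/train.libsvm")
+# upload_to_s3(valid_path, "my-bucket", "abalone/validation.libsvm")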
diff --git a/sagemaker-debugger/xgboost_builtin_rules/xgboost-regression-debugger-rules.ipynb b/sagemaker-debugger/xgboost_builtin_rules/xgboost-regression-debugger-rules.ipynb
new file mode 100644
index 0000000000..e18e231616
--- /dev/null
+++ b/sagemaker-debugger/xgboost_builtin_rules/xgboost-regression-debugger-rules.ipynb
@@ -0,0 +1,644 @@
+{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [
+ "# Debugging XGBoost Training Jobs with Amazon SageMaker Debugger Using Rules\n",
+ "\n",
+ "This notebook was created and tested on an ml.m5.4xlarge notebook instance.\n",
+ "\n",
+ "## Overview\n",
+ "\n",
+ "Amazon SageMaker Debugger is a new capability of Amazon SageMaker that allows debugging machine learning training. \n",
+ "Amazon SageMaker Debugger helps you monitor your training in near real time using rules and provides alerts once it detects inconsistencies in training. \n",
+ "\n",
+ "Using Amazon SageMaker Debugger is a two-step process: saving tensors and analysis.\n",
+ "Let's look at each one of them closely.\n",
+ "\n",
+ "### Saving tensors\n",
+ "\n",
+ "In deep learning algorithms, tensors define the state of the training job at any particular instant in its lifecycle.\n",
+ "Amazon SageMaker Debugger exposes a library which allows you to capture these tensors and save them for analysis.\n",
+ "Although XGBoost is not a deep learning algorithm, Amazon SageMaker Debugger is highly customizable and can help provide interpretability by saving insightful metrics, such as performance metrics or feature importances, at different frequencies.\n",
+ "Refer to the [documentation](https://github.com/awslabs/sagemaker-debugger/blob/master/docs/xgboost.md) for details on how to save the metrics you want.\n",
+ "\n",
+ "\n",
+ "### Analysis\n",
+ "\n",
+ "After the tensors are saved, perform automatic analysis by running debugging ***Rules***.\n",
+ "On a very broad level, a rule is Python code used to detect certain conditions during training.\n",
+ "Some of the conditions that a data scientist training an algorithm may care about are monitoring for gradients getting too large or too small, detecting overfitting, and so on.\n",
+ "Amazon SageMaker Debugger comes pre-packaged with certain rules that can be invoked on Amazon SageMaker. Users can also write their own rules using the Amazon SageMaker Debugger APIs. \n",
+ "For more information about automatic analysis using a rule, see the [rules documentation](https://github.com/awslabs/sagemaker-debugger/blob/master/docs/analysis.md)."
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [
+ "! python -m pip install smdebug"
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [
+ "import boto3\n",
+ "import sagemaker"
+ ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [
+ "Amazon SageMaker Debugger is available in Amazon SageMaker XGBoost container version 0.90-2 or later. If you want to use XGBoost with Amazon SageMaker Debugger, you have to specify `repo_version='0.90-2'` in the `get_image_uri` function."
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [
+ "from sagemaker.amazon.amazon_estimator import get_image_uri\n",
+ "\n",
+ "# Use the region in which this notebook is running\n",
+ "region = boto3.Session().region_name\n",
+ "\n",
+ "container = get_image_uri(region, \"xgboost\", repo_version=\"0.90-2\")"
+ ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [
+ "## Training XGBoost models in Amazon SageMaker with Amazon SageMaker Debugger\n",
+ "\n",
+ "Now train an XGBoost model with Amazon SageMaker Debugger enabled and monitor the training jobs. This is done using the Amazon SageMaker Estimator API. While the training job is running, use the Amazon SageMaker Debugger API to access saved tensors in real time and visualize them. You can rely on Amazon SageMaker Debugger to take care of downloading a fresh set of tensors every time you query for them.\n",
+ "\n",
+ "This example is adapted from [XGBoost for Regression](https://github.com/awslabs/amazon-sagemaker-examples/tree/master/introduction_to_amazon_algorithms/xgboost_abalone). Refer to [XGBoost for Regression](https://github.com/awslabs/amazon-sagemaker-examples/tree/master/introduction_to_amazon_algorithms/xgboost_abalone) for an example of using classification from Amazon SageMaker's implementation of [XGBoost](https://github.com/dmlc/xgboost).\n",
+ "\n",
+ "### Data preparation\n",
+ "\n",
+ "Use the [Abalone data](https://www.csie.ntu.edu.tw/~cjlin/libsvmtools/datasets/regression.html), originally from the [UCI data repository](https://archive.ics.uci.edu/ml/datasets/abalone).\n",
+ "More details about the original dataset can be found [here](https://archive.ics.uci.edu/ml/machine-learning-databases/abalone/abalone.names).\n",
+ "In the libsvm converted [version](https://www.csie.ntu.edu.tw/~cjlin/libsvmtools/datasets/regression.html), the nominal feature (Male/Female/Infant) has been converted into a real-valued feature.\n",
+ "The age of an abalone is to be predicted from eight physical measurements.\n",
+ "\n",
+ "The following methods download the Abalone data, split the data into training and validation\n",
+ "sets, and upload the files to Amazon Simple Storage Service (Amazon S3).\n",
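+ "\n",
+ "For reference, each row in the resulting libsvm files has the form `label index:value index:value ...`; a row might look like this (values illustrative):\n",
+ "\n",
+ "```\n",
+ "15 1:1 2:0.455 3:0.365 4:0.095 5:0.514 6:0.2245 7:0.101 8:0.15\n",
+ "```"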
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [
+ "from data_utils import load_abalone, upload_to_s3\n",
+ "\n",
+ "bucket = sagemaker.Session().default_bucket()\n",
+ "prefix = \"DEMO-smdebug-xgboost-abalone\""
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [
+ "%%time\n",
+ "\n",
+ "train_file, validation_file = load_abalone()\n",
+ "upload_to_s3(train_file, bucket, f\"{prefix}/train/abalone.train.libsvm\")\n",
+ "upload_to_s3(validation_file, bucket, f\"{prefix}/validation/abalone.validation.libsvm\")"
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [
+ "from sagemaker import get_execution_role\n",
+ "\n",
+ "role = get_execution_role()\n",
+ "base_job_name = \"demo-smdebug-xgboost-regression\"\n",
+ "bucket_path = 's3://{}'.format(bucket)\n",
+ "\n",
+ "hyperparameters = {\n",
+ "    \"max_depth\": \"5\",\n",
+ "    \"eta\": \"0.2\",\n",
+ "    \"gamma\": \"4\",\n",
+ "    \"min_child_weight\": \"6\",\n",
+ "    \"subsample\": \"0.7\",\n",
+ "    \"silent\": \"0\",\n",
+ "    \"objective\": \"reg:squarederror\",\n",
+ "    \"num_round\": \"51\",\n",
+ "}\n",
+ "\n",
+ "save_interval = 5"
+ ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [
+ "### Enabling Debugger in the Estimator object\n",
+ "\n",
+ "\n",
+ "#### DebuggerHookConfig\n",
+ "\n",
+ "Enabling Amazon SageMaker Debugger in a training job can be accomplished by adding its configuration to the Estimator object constructor:\n",
+ "\n",
+ "```python\n",
+ "from sagemaker.debugger import DebuggerHookConfig, CollectionConfig\n",
+ "\n",
+ "estimator = Estimator(\n",
+ "    ...,\n",
+ "    debugger_hook_config = DebuggerHookConfig(\n",
+ "        s3_output_path=\"s3://{bucket_name}/{location_in_bucket}\",  # Required\n",
+ "        collection_configs=[\n",
+ "            CollectionConfig(\n",
+ "                name=\"metrics\",\n",
+ "                parameters={\n",
+ "                    \"save_interval\": \"10\"\n",
+ "                }\n",
+ "            )\n",
+ "        ]\n",
+ "    )\n",
+ ")\n",
+ "```\n",
+ "Here, the `DebuggerHookConfig` object tells the `Estimator` what data we are interested in.\n",
+ "Two parameters are provided in the example:\n",
+ "\n",
+ "- `s3_output_path`: This points to the S3 bucket/path where we intend to store the debugging tensors.\n",
+ "  The amount of data saved depends on multiple factors; the major ones are the training job, the data set, the model, and the frequency of saving tensors.\n",
+ "  This bucket should be in your AWS account, and you should have full access control over it.\n",
+ "  **Important note**: this S3 bucket should be created in the same region where your training job will run; otherwise you might run into problems with cross-region access.\n",
+ "\n",
+ "- `collection_configs`: This enumerates the named collections of tensors we want to save.\n",
+ "  Collections are a convenient way to organize relevant tensors under the same umbrella, making it easy to navigate them during analysis.\n",
+ "  In this particular example, you are instructing Amazon SageMaker Debugger that you are interested in a single collection named `metrics`.\n",
+ "  We also instructed Amazon SageMaker Debugger to save the metrics every 10 iterations.\n",
+ "  See the [Collection](https://github.com/awslabs/sagemaker-debugger/blob/master/docs/api.md#collection) documentation for all parameters that are supported by Collections, and the DebuggerHookConfig documentation for more details about all parameters DebuggerHookConfig supports.\n",
+ " \n",
+ "#### Rules\n",
+ "\n",
+ "Enabling rules in a training job can be accomplished by adding the `rules` configuration to the Estimator object constructor.\n",
+ "\n",
+ "- `rules`: This new parameter accepts a list of rules you wish to evaluate against the tensors output by this training job.\n",
+ "  For rules, Amazon SageMaker Debugger supports two types:\n",
+ "  - SageMaker Rules: These are rules specially curated by the data science and engineering teams in Amazon SageMaker which you can opt to evaluate against your training job.\n",
+ "  - Custom Rules: You can optionally choose to write your own rule as a Python source file and have it evaluated against your training job.\n",
+ "    For Amazon SageMaker Debugger to evaluate this rule, you have to provide the S3 location of the rule source and the evaluator image.\n",
+ "\n",
+ "In this example, you will use Amazon SageMaker's LossNotDecreasing rule, which helps you identify whether you are running into a situation where the training loss is not going down.\n",
+ "\n",
+ "```python\n",
+ "from sagemaker.debugger import rule_configs, Rule\n",
+ "\n",
+ "estimator = Estimator(\n",
+ "    ...,\n",
+ "    rules=[\n",
+ "        Rule.sagemaker(\n",
+ "            rule_configs.loss_not_decreasing(),\n",
+ "            rule_parameters={\n",
+ "                \"collection_names\": \"metrics\",\n",
+ "                \"num_steps\": \"10\",\n",
+ "            },\n",
+ "        ),\n",
+ "    ],\n",
+ ")\n",
+ "```\n",
+ "\n",
+ "- `rule_parameters`: In this parameter, you provide the runtime values for the parameters in your constructor.\n",
+ "  You can also pass in other values which may be necessary for your rule to be evaluated.\n",
+ "  In this example, you will use Amazon SageMaker's LossNotDecreasing rule to monitor the `metrics` collection.\n",
+ "  The rule will alert you if the tensors in `metrics` have not decreased for more than 10 steps."
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [
+ "from sagemaker.debugger import rule_configs, Rule, DebuggerHookConfig, CollectionConfig\n",
+ "from sagemaker.estimator import Estimator\n",
+ "\n",
+ "algorithm_mode_default_estimator = Estimator(\n",
+ "    role=role,\n",
+ "    base_job_name=base_job_name,\n",
+ "    train_instance_count=1,\n",
+ "    train_instance_type='ml.m5.xlarge',\n",
+ "    image_name=container,\n",
+ "    hyperparameters=hyperparameters,\n",
+ "    train_max_run=1800,\n",
+ "\n",
+ "    debugger_hook_config=DebuggerHookConfig(\n",
+ "        s3_output_path=bucket_path,  # Required\n",
+ "        collection_configs=[\n",
+ "            CollectionConfig(\n",
+ "                name=\"metrics\",\n",
+ "                parameters={\n",
+ "                    \"save_interval\": str(save_interval)\n",
+ "                }\n",
+ "            ),\n",
+ "            CollectionConfig(\n",
+ "                name=\"feature_importance\",\n",
+ "                parameters={\n",
+ "                    \"save_interval\": str(save_interval)\n",
+ "                }\n",
+ "            ),\n",
+ "            CollectionConfig(\n",
+ "                name=\"average_shap\",\n",
+ "                parameters={\n",
+ "                    \"save_interval\": str(save_interval)\n",
+ "                }\n",
+ "            ),\n",
+ "        ],\n",
+ "    ),\n",
+ "\n",
+ "    rules=[\n",
+ "        Rule.sagemaker(\n",
+ "            rule_configs.loss_not_decreasing(),\n",
+ "            rule_parameters={\n",
+ "                \"collection_names\": \"metrics\",\n",
+ "                \"num_steps\": str(save_interval * 2),\n",
+ "            },\n",
+ "        ),\n",
+ "    ],\n",
+ ")"
+ ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [
+ "With the next step, start a training job by using the Estimator object you created above. This job is started in an asynchronous, non-blocking way.
This means that control is passed back to the notebook and further commands can be run while the training job progresses."
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [
+ "from sagemaker.session import s3_input\n",
+ "\n",
+ "train_s3_input = s3_input(\"s3://{}/{}/{}\".format(bucket, prefix, \"train\"), content_type=\"libsvm\")\n",
+ "validation_s3_input = s3_input(\"s3://{}/{}/{}\".format(bucket, prefix, \"validation\"), content_type=\"libsvm\")\n",
+ "algorithm_mode_default_estimator.fit(\n",
+ "    {\"train\": train_s3_input, \"validation\": validation_s3_input},\n",
+ "    # This is a fire-and-forget event. By setting wait=False, you just submit the job to run in the background.\n",
+ "    # Amazon SageMaker starts one training job and releases control to the next cells in the notebook.\n",
+ "    # Follow this notebook to see the status of the training job.\n",
+ "    wait=False\n",
+ ")"
+ ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [
+ "### Result\n",
+ "\n",
+ "As a result of the above command, Amazon SageMaker starts one training job and one rule job for you. The first one is the job that produces the tensors to be analyzed. The second one analyzes the tensors to check if `train-rmse` and `validation-rmse` are not decreasing at any point during training.\n",
+ "\n",
+ "Check the status of the training job below.\n",
+ "After your training job is started, Amazon SageMaker starts a rule-execution job to run the LossNotDecreasing rule.\n",
+ "\n",
+ "**Note that the next cell blocks until the rule execution job ends. You can stop it at any point to proceed to the rest of the notebook. Once it says Rule Evaluation Status is Started, and shows the `RuleEvaluationJobArn`, you can look at the status of the rule being monitored.**"
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [
+ "import time\n",
+ "\n",
+ "for _ in range(360):\n",
+ "    job_name = algorithm_mode_default_estimator.latest_training_job.name\n",
+ "    client = algorithm_mode_default_estimator.sagemaker_session.sagemaker_client\n",
+ "    description = client.describe_training_job(TrainingJobName=job_name)\n",
+ "    training_job_status = description[\"TrainingJobStatus\"]\n",
+ "    rule_job_summary = algorithm_mode_default_estimator.latest_training_job.rule_job_summary()\n",
+ "    rule_evaluation_status = rule_job_summary[0][\"RuleEvaluationStatus\"]\n",
+ "    print(\"Training job status: {}, Rule Evaluation Status: {}\".format(training_job_status, rule_evaluation_status))\n",
+ "\n",
+ "    if rule_evaluation_status in [\"Stopped\", \"IssuesFound\", \"NoIssuesFound\"]:\n",
+ "        break\n",
+ "\n",
+ "    time.sleep(10)"
+ ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [
+ "### Check the status of the Rule Evaluation Job\n",
+ "\n",
+ "To get the rule evaluation job that Amazon SageMaker started for you, run the command below.
The results show you the `RuleConfigurationName`, `RuleEvaluationJobArn`, `RuleEvaluationStatus`, and `StatusDetails`.\n",
+ "If the tensors meet a rule evaluation condition, the rule execution job throws a client error with `RuleEvaluationConditionMet`.\n",
+ "\n",
+ "The logs of the rule evaluation job are available in the CloudWatch log stream `/aws/sagemaker/ProcessingJobs` with `RuleEvaluationJobArn`.\n",
+ "\n",
+ "You can see that once the rule execution job starts, it identifies the loss-not-decreasing situation in the training job, raises the `RuleEvaluationConditionMet` exception, and ends the job."
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [
+ "algorithm_mode_default_estimator.latest_training_job.rule_job_summary()"
+ ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [
+ "## Making this a good run\n",
+ "\n",
+ "In the previous example, you saw how a LossNotDecreasing rule was run that analyzed the tensors while training was running and produced an alert.\n",
+ "\n",
+ "You can go back, change the values in the `hyperparameters` dict passed to the estimator, and start a new training job (e.g., use a smaller learning rate, `eta=0.05`). You can see that the LossNotDecreasing rule is not fired in that case, as both `train-rmse` and `validation-rmse` keep decreasing steadily throughout the entire training duration."
+ ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [
+ "## Data Analysis - Manual\n",
+ "\n",
+ "Now that you've trained the system, analyze the data.\n",
+ "Here, you focus on after-the-fact analysis.\n",
+ "\n",
+ "You import a basic analysis library, which defines the concept of a trial, which represents a single training run."
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [
+ "from smdebug.trials import create_trial\n",
+ "\n",
+ "s3_output_path = algorithm_mode_default_estimator.latest_job_debugger_artifacts_path()\n",
+ "trial = create_trial(s3_output_path)"
+ ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [
+ "You can list all the tensors that you know something about. Each of these names is the name of a tensor. The name is a combination of the feature name (which, in these cases, is auto-assigned by XGBoost) and whether it's an evaluation metric, a feature importance, or a SHAP value."
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [
+ "trial.tensor_names()"
+ ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [
+ "For each tensor, ask for the steps where you have data. In this case, that is every five steps."
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [
+ "trial.tensor(\"train-rmse\").steps()"
+ ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [
+ "You can obtain each tensor at each step as a NumPy array."
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [
+ "type(trial.tensor(\"train-rmse\").value(10))"
+ ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [
+ "### Performance metrics\n",
+ "\n",
+ "You can also create a simple function that visualizes the training and validation errors as the training progresses.\n",
+ "Each error should get smaller over time, as the system converges to a good solution.\n",
+ "Remember that this is an interactive analysis. You are showing these tensors to give an idea of the data."
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [
+ "import matplotlib.pyplot as plt\n",
+ "import seaborn as sns\n",
+ "import re\n",
+ "\n",
+ "\n",
+ "def get_data(trial, tname):\n",
+ "    \"\"\"\n",
+ "    For the given tensor name, walks through all the iterations\n",
+ "    for which you have data and fetches the values.\n",
+ "    Returns the set of steps and the values.\n",
+ "    \"\"\"\n",
+ "    tensor = trial.tensor(tname)\n",
+ "    steps = tensor.steps()\n",
+ "    vals = [tensor.value(s) for s in steps]\n",
+ "    return steps, vals\n",
+ "\n",
+ "def plot_collection(trial, collection_name, regex='.*', figsize=(8, 6)):\n",
+ "    \"\"\"\n",
+ "    Takes a `trial` and a collection name, and\n",
+ "    plots all tensors that match the given regex.\n",
+ "    \"\"\"\n",
+ "    fig, ax = plt.subplots(figsize=figsize)\n",
+ "    sns.despine()\n",
+ "\n",
+ "    tensors = trial.collection(collection_name).tensor_names\n",
+ "\n",
+ "    for tensor_name in sorted(tensors):\n",
+ "        if re.match(regex, tensor_name):\n",
+ "            steps, data = get_data(trial, tensor_name)\n",
+ "            ax.plot(steps, data, label=tensor_name)\n",
+ "\n",
+ "    ax.legend(loc='center left', bbox_to_anchor=(1, 0.5))\n",
+ "    ax.set_xlabel('Iteration')"
+ ] + },
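+ { + "cell_type": "markdown", + "metadata": {}, + "source": [
+ "For example, the `metrics` collection saved above can be plotted with a call along these lines (illustrative; the exact call in the following cell may differ):\n",
+ "\n",
+ "```python\n",
+ "plot_collection(trial, \"metrics\")\n",
+ "```"
+ ] + },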
"iVBORw0KGgoAAAANSUhEUgAAAlEAAAF3CAYAAACFYR5oAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDMuMC4zLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvnQurowAAIABJREFUeJzt3Xt8XXWd7//3d19zaZpekmaHtPRCGtqkSYotxbYgStWD0gFB0Yo6OAyiw1FGR8ejHM7MaJnf4BnkjCIoDCoyFBURRkUP3gYK4k+g1F7TFNI2SZtLk/SSW3PZl+/5Y++0aZo0yc5l7cvr+Xjksdflu9b+ZJGkb77ru7/LWGsFAACA8XE5XQAAAEAyIkQBAADEgRAFAAAQB0IUAABAHAhRAAAAcSBEAQAAxIEQBQAAEAdCFAAAQBwIUQAAAHEgRAEAAMTBMxUnvfrqq+1zzz03FacGACARGacLwPSbkp6otra2qTgtAABAwuB2HgAAQBwIUQAAAHEgRAEAAMSBEAUAABAHQhQAAEAcCFEAAABxIEQBAADEgRAFAAAQB0IUAABAHAhRAAAAcSBEAQAAxCGhQ5S1Vq/VHtfexnanSwEAADhLQocoY4w+/cR2ffelQ06XAgAAcJaEDlGSVFo4U1VNHU6XAQAAcJYxhShjzOeMMXuNMXuMMT80xmRMdWEDlhfOVE1Ll/pC4el6SwAAgFGNGqKMMUWS7pC02lq7QpJb0qapLmxA6QUzFYpY1bR0TddbAgAAjGqst/M8kjKNMR5JWZIap66ksy0vnClJqmrklh4AAEgco4Yoa22DpHsl1UtqktRurf3NVBc2YNHcbGV63drX1DldbwkAADCqsdzOmy3pOkmLJV0gKdsY89Fh2t1mjNlmjNnW2to6aQW6XUYXB3JU1cQ0BwAAIHGM5XbeOyUdsta2WmuDkp6WtG5oI2vtw9ba1dba1fn5+ZNX4anjWpPXr31NnbLWTt55AQAAJmAsIape0luNMVnGGCNpg6R9U1tWjLXS/at0Y+d/qL0nqKb23ml5WwAAgNGMZUzUK5KekrRd0u7YMQ9PcV1RxkiBcl3Q+6YkBpcDAIDEMaZP51lr/9Fau8xau8Ja+zFrbd9UF3ZaoFxZJ/bLY8Lax6SbAAAgQST8jOUqrJQJ9+mKWceZuRwAACSMxA9RgQpJ0pUzm+iJAgAACSPxQ9TcYsmToUrPYdUeO6WuvpDTFQEAACRBiHJ7pIIyLQwekCTtb6Y3CgAAOC/xQ5QkBco1q71aklUVM5cDAIAEkCQhqkKuvpNaltHONAcAACAhJEeIKqyUJL1r9lEGlwMAgISQHCFqXqlkXFqdcUTVzR0KR3j8CwAAcFZyhChfljR3qZZGDqo3GFHtsW6nKwIAAGkuOUKUJAXKldf9hiQe/wIAAJyXPCGqsEK+rgbluboYFwUAAByXPCEqNnP5O2e3EKIAAIDjki5Erc1u5Bl6AADAcckTorLnSjOLVOaq1dGOPh3r6nO6IgAAkMaSJ0RJUqBcF/S8KUnax8zlAADAQUkWoiqU2XFAfvUzLgoAADgquUJUYYWMjWj9jKOMiwIAAI5KrhAVKJckvW1mEz1RAADAUckVomYtlPy5WumtV01Ll/pCYacrAgAAaSq5QpQxUqBcC/trFIpYvXm0y+mKAABAmkquECVJhRXK7XxDLkW4pQcAAByTfCEqUCFXqFfLvQwuBwAAzknCEBUdXH7V7KP0RAEAAMckX4jKv1hy+3Sp/4iqGjtkrXW6IgAAkIaSL0S5vdK85VoaOaiO3pAa23udrggAAKSh5AtRkhSoUF7XG5Ks9jVySw8AAEy/5AxRhZXy9h1XoTnO4HIAAOCI5AxRscHlb89lcDkAAHBGcoaogjJJRuuyGuiJAgAAjkjOEOXPkeYsUampVd2xU+rqCzldEQAASDPJGaIkqbBCF/S+KUmqpjcKAABMs+QNUYEKZXYf0Ux1My4KAABMu6QOUZJ0aSbjogAAwPRL3hBVGA1RV+Y0qaqp0+FiAABAukneEDVjnjSjQJXeeu1v7lA4wuNfAADA9EneECVJgQotDB5QbzCiQ23dTlcDAADSSHKHqMIK5XYelE9BBpcDAIBpNWqIMsZcbIzZMeirwxjz2ekoblSBchkb0nI3g8sBAMD08ozWwFq7X9JKSTLGuCU1SHpmiusam9gn9N6e26ydhCgAADCNxns7b4OkA9bauqkoZtxmL5Z8ObrUf1hVjYQoAAAwfcYbojZJ+uFUFBIXl0sKrFBx5JBaOvt0rKvP6YoAAECaGHOIMsb4JF0r6Scj7L/NGLPNGLOttbV1suobXaBC+d1vyiiifcwXBQAApsl4eqLeI2m7tfbocDuttQ9ba1dba1fn5+dPTnVjESiXO9Stheaoqprap+99AQBAWhtPiPqwEulW3oDYzOWXZzfSEwUAAKbNmEKUMSZb0rskPT215cQhf5nk8mhtdgODywEAwLQZU4iy1nZba+daaxPvfpnHL+UvV6mp04HWLvUGw05XBAAA0kByz1g+oLBChT1vKhSxqmnpcroaAACQBlIjRAXKldHXpnydZOZyAAAwLVIkREUHl7/FV88z9AAAwLRIkRC1QpJ0RU4Tg8sBAMC0SI0QlZErzV6kSk+0J8pa63RFAAAgxaVGiJKkQIUWBg+oozekhpM9TlcDAABSXEqFqJmn6pWtHibdBAAAUy51QlRs5vJSVx3jogAAwJRLnRAVKJckXTGjiU/oAQCAKZc6ISqnUMrK0+qMI9rXTIgCAABTK3VClDFSYYWKIwdVd+yUOnuDTlcEAABSWOqEKEkKlCvv1EF5FdL+ZgaXAwCAqZNiIapCrkhQxaaBx78AAIAplXIhSpIuzTjM4HIAADClUitEzb1I8mZpbVYj0xwAAIAplVohyuWWClao1NRq/9FOhSM8/gUAAEyN1ApRkhQo1wW9NeoNhnWordvpagAAQIpKvRBVWCFvqEsLTAuDywEAwJRJvRAVm7m80l3H4HIAADBlUi9EzSuTjFvrZjQxuBwAAEyZ1AtR3gwp/2Kt9NTTEwUAAKZM6oUoSQqUa2HwgFo6+9TW1ed0NQAAIAWlaIiqUHZfi+aog94oAAAwJVI0REUHl5e5aglRAABgSqR0iLoss4HB5QAAYEqkZojKmiPlXqhLM45oX1On09UAAIAUlJohSpIC5SqOHFJNa5d6g2GnqwEAACkmdUNUYYXm9NTJF+lRTUuX09UAAIAUk7ohKlAuI6vlpp5xUQAAYNKlcIiqkCRVeg/zDD0AADDpUjdE5c6XMmdrbVYD0xwAAIBJl7ohyhgpUK5SU6eqpg5Za52uCAAApJDUDVGSFKhQoO+gTvX2qeFkj9PVAACAFJLyIcoT6dNFppHB5QAAYFKldogqjA4uL3PVMekmAACYVKkdouYulTwZWpvVoKqmdqerAQAAKSS1Q5TbI80rVaWnnp4oAAAwqcYUoowxs4wxTxljqo0x+4wxa6e6sElTWKGFwQOqP96tzt6g09UAAIAUMdaeqG9Ies5au0
...[remainder of base64-encoded PNG elided; the output is the rendered plot of the metrics collection]...",
+ "text/plain": [
+ ""
+ ]
+ },
+ "metadata": {},
+ "output_type": "display_data"
+ }
+ ],
+ "source": [
+ "plot_collection(trial, \"metrics\")"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "### Feature importances\n",
+ "\n",
+ "You can also visualize the feature importances as determined by\n",
+ "[xgboost.get_score()](https://xgboost.readthedocs.io/en/latest/python/python_api.html#xgboost.Booster.get_score).\n",
+ "If you instructed the Estimator to log the `feature_importance` collection, all five importance types supported by `xgboost.get_score()` will be available in the collection. A short usage sketch follows the output cell below."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "def plot_feature_importance(trial, importance_type=\"weight\"):\n",
+ "    # xgboost.get_score() supports exactly these five importance types.\n",
+ "    SUPPORTED_IMPORTANCE_TYPES = [\"weight\", \"gain\", \"cover\", \"total_gain\", \"total_cover\"]\n",
+ "    if importance_type not in SUPPORTED_IMPORTANCE_TYPES:\n",
+ "        raise ValueError(f\"{importance_type} is not one of the supported importance types.\")\n",
+ "    # Plot only the tensors of the requested importance type from the collection.\n",
+ "    plot_collection(\n",
+ "        trial,\n",
+ "        \"feature_importance\",\n",
+ "        regex=f\"feature_importance/{importance_type}/.*\")"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [
+ {
+ "data": {
+ "image/png": "...[base64-encoded PNG and remainder of this cell elided; the output is the rendered feature importance plot]..."
+ }
+ }
+ ]
+ },
xdHI+jP56HZG5bPdFERERE9xI+1q/HXFqKtOiREDY26Lzta8iUyiZdV15ShZ/WJ+Lc8Vx0CNLigScM0Git73BriYjoPsLH+kS12HNaj0ythvuC+ahKS0Pe6tVNvs5Wo8SDU7pg0PhAXE4rxKbXDyL1aPYdbCkRERHR/Ynh9AaayEjYj4xG7tp1qEhKbvJ1QgiERHphzJw+sHe2xc41p2rXRjXewdYSERER3V/4WL8Bxvx8pP3pz7Dy7gifDRsg5Lc3E99kNOPXmHM48n0GHFxsMezpELj52N+h1hIR0X2Aj/WJarHntAEKrRZur7yCiuMnkP/557d9vVwhQ/hDfnhoRg+YjGZ8tfQw4v+bDjMnSxERERHdEsPpTdiP+DPUUQORvfJtVF3MbFYNL70WY+b2gW9PHQ5+k4ZtK46gKK/cwi0lIiIiun8wnN6EEAIe8+ZBCIHL8+ejucMfbNRW+MPTIRj6ZDByL5Zg8+uHkHzosoVbS0RERHR/YDi9BStPT+hmzkTpvn0o2r692XWEEAjs54Exc/rAyUOFHz88jR8/SkBlOSdLEREREdXHCVGNkEwmZIyfgKpz5+D73bdQODu3qJ7ZZEb8fzMQ/+05aJxsMOwpAzz8HS3UWiIiukdxQhRRLfacNkLI5fB4/TWYyspwZdGbLa4nk8vQZ0RnPPzPXhAC+Hr5Efwac67ZwwaIiIiI7icMp01g7e8Pl2eeQdG336J4zx6L1PTwc8CYOX3gH+aGQzHnkHokxyJ1iYiIiO5lDKdN5DJ5EqwD/HF5/gKYSkotUlNpq8DQJ4Oh87bD3i+SUcUxqERERNTOMZw2kVAq4b5wIYxXriBnxQqL1ZXJZYgaF4iyoioc3J5msbpERERE9yKG09ug6tED2gkTkL9xI8qOHLFYXTcfe3QZ6IWTey4iO6PIYnWJiIiI7jUMp7fJdfrzUHi4I+vVf8FcVWWxun0f8oOtnRKxG5K4kxQRERG1Wwynt0mmVsNjwQJUpaYib/Uai9W1tlUg4rEAZGcU41Rs83akIiIiIrrXMZw2gyYyEvYjo5G7bh0qkpMtVtc/zBUdDU745ZtUlBZUWqwuERER0b2C4bSZ3F5+GXKNBlmvvgrJZLJITSEEBo7Vw2yUsG/rWYvUJCIiIrqXNCmcCiGGCyGShBApQojZDbw/RQhxUghxTAixTwhhqD3uI4Qorz1+TAix2tIf4G5RaLVwe+UVVBw/gfzPP7dYXUdXFXo92Akp8dk4n5BnsbpERERE94JGw6kQQg7gfQAPAjAAeLwufNazQZKkLpIkdQewFED9tZZSJUnqXvtriqUa3hbYj/gz1AMjkf32O6i6aLlxoj3/0AmObirEbkyCscoyvbJERERE94Km9Jz2AZAiSVKaJElVADYB+Ev9EyRJqr/+kRpAu5huLoSAx/z5AIDL8+dbbAtSuVXN2qdFuRU4vDPDIjWJiIiI7gVNCadeAC7Ue32x9th1hBDPCiFSUdNz+o96b3UWQhwVQsQKISJb1No2yMrTE64zZqB03z4U7dhhsbodArUI7OuOI99nIP+yZXakIiIiImrrLDYhSpKk9yVJ8gPwEoC5tYezAHhLktQDwEwAG4QQ9jdeK4SYLISIF0LE5+Tce3vMa8c9Dtvu3XFl0ZswXr1qsbr9R/nDylqO2A1JFuuVJSIiImrLmhJOMwF0rPe6Q+2xm9kE4CEAkCSpUpKkvNrfHwaQCkB/4wWSJK2VJClMkqQwnU7X1La3GUIuh8drC2EqLcWVNxZZrK7KXonwh/2QmVyA5IOXLVaXiIiIqK1qSjg9BCBACNFZCKEEMBbA9vonCCEC6r38M4Cztcd1tROqIITwBRAA4L7cQN46IAAuzzyDom+/RfGePRaraxjgCXdfe+z/MgUVpdUWq0tERETUFjUaTiVJMgKYBuB7AGcAfCFJUoIQYqEQYmTtadOEEAlCiGOoeXz/RO3xgQBO1B7fCmCKJEmWe+7dxjhPngSlvx8uL1gIU4llxokKmUDUuCBUlBrx89epFqlJRERE1FaJtjaWMSwsTIqPj7/bzWi2sqNHkTFuPLTjxsH91bmNX9BE+79MwbEfz+ORF3vBw8/BYnWJiKhNEHe7AURtBXeIsjBVjx7Qjh+P/A0bUHbkqMXq9v6zDzRaa8RuSITJZLZYXSIiIqK2hOH0DtBNnw6FhzuyXn0V5qoqi9RU2igQOUaPvMxSHN99ofELiIiIiO5BDKd3gFyjhsf8+ahKTUXe6jUWq+vbXQefri44FHMORXnlFqtLRERE1FYwnN4hmoEDYR8djdx161CRnGyxugPH1qzEtXfzWYvVJCIiImorGE7vILeXZ0OuViPr1VchmUwWqWnnZIM+I3yRfiIXacfuvQ0LiIiIiG6F4fQOUjg5wW3OK6g4fgL5n2+wWN2uD3SAs5caezcno6rCaLG6RERERHcbw+kdZj9iBNQDI5H99tuozrzVxlpNJ5fLEDUuCCX5lTgUc84iNYmIiIjaAobTO0wIAY958wAAWfPmw1Lrynr4OcAQ4Ynj/7uI3IvFFqlJREREdLcxnLYCKy8vuE6fjtJ9+1C0Y4fF6oY/7AcbtQJ7Pk+CZG5bmykQERERNQfDaSvRjh8H227dcGXRmzBetcwOrjZqKwwY5Y8r54pwev8li9QkIiIiupsYTluJkMvh8fprMJWW4sqiNy1WV9/XHV56R/z8dSrKiiyz4D8RERHR3cJw2oqsAwLgMnkyimJiULxnj0VqCiEQNS4Q1ZUm7P+Sa58SERHRvY3htJU5PzMZSn8/XF6wEKaSUovU1Lqr0fOPnZB88AouJlpmyAARERHR3cBw2spkSiU8XnsNxsuXkbNypcXq9hreCfY6W8RuTIap2myxukREREStieH0LlD16AHt+PHI37ABZUeOWqSmQilH1Fg9Cq6U4cgPGRapSURERNTaGE7vEt306VC4uyPr1VdhrrLMRCbvEGf4h7ni8H8zUJBdZpGaRERERK2J4fQukWvU8FgwH1Wpqchbs9ZidSMeC4BcIRC3McliC/4TERERtRaG07tIM3Ag7EeMQO7atSg/ecoiNdUO1uj3kB8unMlHSny2RWoSERERtRaG07vMbc4rUDg7I/OFF2AqKbFIzZCBXnDtZId9W86isqzaIjWJiIiIWgPD6V2m0GrhtfwtVGdm4vK/5lnkUbxMVrP2aXlxFQ5+k2aBVhIRERG1DobTNkDVqxd0z01D0XffoWDrVovUdO1kjy6DOuBkXCaunCuySE0iIiKiO43htI1wnjQJqvB+uPLGIlSetcxOT31H+kJtr8SeDYkwm7j2KREREbV9DKdthJDL4bV0KWRqNS7OmAFzeXmLayptFYgYrUfuhRKc3JNpgVYSERER3VkMp22IQqeD55IlqEpJxZVFiyxS06+nDt4hzji4PQ0l+ZUWqUlERER0pzCctjGaiAFwnjQJBVu2ovDbb1tcTwiBgWP1MJsl7NuSbIEWEhEREd05DKdtkO4fz8G2e3dc/tc8VGW0fCtSB50twv7kg9QjOUg/mWuBFhIRERHdGQynb
kbEYxFxZSn7FHBY2d/Npaw5Shtl3xsjYkNErGjZ9w8i4lsR8aOIuLncKUuSpCmbbOREUme4hnL3H4ASMv8vM98YEQuA+yPie6XuGcAbMvPJsv3+zNxebhG6JiJuzcxrIuLqzBzax3ddBgwBy6juIrYmIu4t750OvB74JXA/1b3d75v9H1eS1K0cOZW60/nAeyNiHdUtU48GTinvPdgSTAE+FBHrgR8CJ7bU25+zgVsys5GZW4F7gDe27PsXmTkOrKOabiBJ0pQ5cip1pwD+JjPvnFBY3V99x17bbwWWZ+avI+IHwOAMvrf1/uwN/B0jSZomR06l7vAScHjL9p3AX0VEP0BEvC4iFu7jc68EXijBdClwVst7o83P72U1sKLMaz0WeAvw4Kz8FJKkec9RDak7PAo0yun5rwI3UJ1Sf7hclPQccOk+Pvdd4IMR8TjwY6pT+01fAh6NiIcz8z0t5bcBy4H1QAIfzcxnS7iVJGlGIjPb3QZJkiQJ8LS+JEmSasRwKkmSpNownEqSJKk2DKeSJEmqDcOpJEmSasNwKkmSpNownEqSJKk2DKeSJEmqjf8H+z3Yr7NRzGsAAAAASUVORK5CYII=\n", + "text/plain": [ + "
" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "plot_feature_importance(trial)" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [ + { + "data": { + "image/png": "iVBORw0KGgoAAAANSUhEUgAAAqEAAAF3CAYAAACVLZLxAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDMuMC4zLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvnQurowAAIABJREFUeJzs3XtUVFeeNv5nV3G/yqW4FHJRoCgOEJIGwSgG0XYGux07M8ZoUHJzNPFNpqPpdHRG+02iMWGcXJ3o0tiZmJgVtCGZ/AjJGycxLd4SIsYgFxHQQABRQW5yp6r27w/AoREUFCmU57OWa1nnnH3293TR+GSfs88WUkoQEREREY0mlbkLICIiIqLxhyGUiIiIiEYdQygRERERjTqGUCIiIiIadQyhRERERDTqGEKJiIiIaNQxhBIRERHRqGMIJSIiIqJRN6QQKoRIFEKcFkKUCiHWXuO4BUIIKYSI7vkcIIRoE0L81PNn+0gVTkRERES3L4vrHSCEUAPYCmAOgEoAx4QQGVLKwn7HOQJ4BkB2v1OckVLePUL1EhEREdEd4LohFEAMgFIp5VkAEELsAfA7AIX9jtsI4N8B/PFmCkpMTJRfffXVzZyCiIjodiPMXQDRaBvK7XgfABV9Plf2bLtCCPErAL5Syi8GaD9JCHFCCJElhJhxvc5qa2uHUBIRERER3c6GMhJ6TUIIFYA3ADw6wO5qAH5SyktCiCgAnwkhwqSUTf3OsQLACgDw8/O72ZKIiIiIaIwbykhoFQDfPp8n9mzr5QggHMABIUQZgKkAMoQQ0VLKDinlJQCQUh4HcAaArn8HUsp3pZTRUspojUZzY1dCRERERLeNoYTQYwCChRCThBBWABYDyOjdKaVslFK6SykDpJQBAL4HMF9KmSOE0PRMbIIQYjKAYABnR/wqiIiIiOi2ct3b8VJKgxDiaQD7AKgB/JeUskAIsQFAjpQy4xrN7wOwQQjRBcAE4EkpZd1IFE5EREREty8hpTR3DX8jOjpa5uTkmLsMIiKi0cTZ8TTucMUkIiIiIhp1DKFERERENOoYQomIiIho1DGEEhEREdGoYwglIiIiolHHEEpEREREo+6ODqFSSlSdrkdTbZu5SyEiIiKiPu7oENre0oWMLT8h99sKc5dCRERERH3c0SHU1sEKk+/W4HT2eRi6jOYuh4iIiIh63NEhFACUOC06Wgw4e6LG3KUQERERUY87PoRODHGBk7sNCg+fM3cpRERERNTjjg+hQiWgxGlRVdyAhgut5i6HiIiIiDAOQigA6O/1hlAJFB7haCgRERHRWDAuQqi9szUCItxQ9F01jAaTucshIiIiGvfGRQgFuicotV3uQtnJWnOXQkRERDTujZsQ6hfmBgcXaxRwghIRERGR2Y2bEKpSCYRO80bFqTquoERERERkZuMmhAJA6HQtAODU0WozV0JEREQ0vo2rEOroagM/xQ2njpyDycgJSkRERETmMq5CKACExWnR0tiJ8oI6c5dCRERENG6NuxDqf5cbbJ2suIISERERkRmNuxCqVqsQOs0b5Xm1aK7vMHc5REREROPSuAuhAKBM94aUQNF3HA0lIiIiModxGUKdNXaYqHdB4ZFqSJM0dzlERERE4864DKFA9wpKly+1o6KIE5SIiIiIRtu4DaGTIzWwsbfkBCUiIiIiMxi3IVRtqULIvV74+adatDZ1mrscIiIionFl3IZQAFCma2EySRR9xxWUiIiIiEbTuA6hrt728A5yRuGRc5CSE5SIiIiIRsu4DqFA9wSlxottOFfcYO5SiIiIiMaNcR9CA3/lAStbCxRwghIRERHRqBn3IdTSSo2QGE+cPVGD9pYuc5dDRERENC4MKYQKIRKFEKeFEKVCiLXXOG6BEEIKIaL7bPvXnnanhRB/PxJFjzRlhhZGgwmnvz9v7lKIiIiIxoXrhlAhhBrAVgBzASgAHhJCKAMc5wjgGQDZfbYpABYDCAOQCGBbz/nGFPeJjvAIcOIEJSIiIqJRMpSR0BgApVLKs1LKTgB7APxugOM2Avh3AO19tv0OwB4pZYeU8mcApT3nG3PC4rSoO9eCCz83mbsUIiIiojveUEKoD4CKPp8re7ZdIYT4FQBfKeUXw207VgRFe8DSWs0JSkRERESj4KYnJgkhVADeAPCHmzjHCiFEjhAip6am5mZLuiFWNhYInuKJ0pwL6GgzmKUGIiIiovFiKCG0CoBvn88Te7b1cgQQDuCAEKIMwFQAGT2Tk67XFgAgpXxXShktpYzWaDTDu4IRpMRpYeg0oeQHTlAiIiIiupWGEkKPAQgWQkwSQlihe6JRRu9OKWWjlNJdShkgpQwA8D2A+VLKnJ7jFgshrIUQkwAEA/hhxK9ihHj4O8JtogNvyRMRERHdYtcNoVJKA4CnAewDcArAX6SUBUKIDUKI+ddpWwDgLwAKAXwF4CkppfHmy741hBAIi9OitqIZF8s5QYmIiIjoVhFj7ZVE0dHRMicnx2z9d7R2YdeaIwiZ6oWZS/Rmq4OIiMYVYe4CiEbbuF8xqT9rO0sERnmg+NgFdLZzghIRERHRrcAQOgAlTouudiNKj180dylEREREdySG0AF4BzrDxcsOhZygRERERHRLMIQOQAgBJU6LCz834VJVs7nLISIiIrrjMIQOImSqF1QWgqOhRERERLcAQ+ggbB2sEHi3Bqezz8PQNWbfKkVERER0W2IIvQYlTouOVgPO/GiepUSJiIiI7lQModfgo3OBk8aWt+SJiIiIRhhD6DUIlYAy3RvnShpQf77F3OUQERER3TEYQq9Df683VCqBwiPV5i6FiIiI6I7BEHod9s7WCLjLHae/r4bRYDJ3OURERER3BIbQIVDitGi73IWfc2vNXQoRERHRHYEhdAh8FVc4uFqj8HCVuUshIiIiuiMwhA6BSiWgTNei4lQ9mmrbzF0OERER0W2PIXSIQqd5Qwig8Ahf10RERER0sxhCh8jBxQZ+4W4oOloNk5ETlIiIiIhuBkPoMCjTtWhp7ER5/iVzl0JERER0W2MIHYaACDfYOVtxBSUiIiKim8QQOgwqtQqh93qjPP8SmuvbzV0OERER0W2LIXSYQqdrISVw6ihXUCIiIiK6UQyhw+SsscVEvQsKj5yDySTNXQ4RERHRbYkh9AYocVo013Wg4lSduUshIiIiui0xhN6AyZEa2DhYcoISERER0Q1iCL0BaksV9FO9UJZbi9amTnOXQ0RERHTbYQi9QUqcFiaTRNF3nKBERERENFwMoTfIxcse2uAJKDx8DlJyghIR
+ { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [ + { + "data": { + "image/png": "[base64 PNG image data omitted: cover-importance plot]\n", + "text/plain": [ + "[figure text representation omitted]
" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "plot_feature_importance(trial, importance_type=\"cover\")" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### SHAP\n", + "\n", + "[SHAP](https://github.com/slundberg/shap) (SHapley Additive exPlanations) is\n", + "another approach to explain the output of machine learning models.\n", + "SHAP values represent a feature's contribution to a change in the model output.\n", + "You instructed Estimator to log the average SHAP values in this example so the SHAP values (as calculaged by [xgboost.predict(pred_contribs=True)](https://xgboost.readthedocs.io/en/latest/python/python_api.html#xgboost.Booster.predict)) will be available the `average_shap` collection." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [ + { + "data": { + "image/png": "iVBORw0KGgoAAAANSUhEUgAAAmgAAAF3CAYAAAARh7eaAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDMuMC4zLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvnQurowAAIABJREFUeJzs3XlYVFe+L/zv3jVPFBTFTAkCTjiggiZorjlpNS2Z2jj00ZCb2OnY8SYkfRtjP/dkOP2+3tPvq+mO3VdPYkz6JGlzNMMxSbfBKQ7dmDYxSlQMIgooQiEzVUXNVbv2un9UUQLiFNEC+X2ep1J7ql2LMsCX39p7LY4xBkIIIYQQMnjwkW4AIYQQQgjpjQIaIYQQQsggQwGNEEIIIWSQoYBGCCGEEDLIUEAjhBBCCBlkKKARQgghhAwyFNAIIYQQQgYZCmiEEEIIIYMMBTRCCCGEkEGGAhohhBBCyCAjjXQDrmTevHls9+7dkW4GIYQQcjtxkW4AGRwGbQWtvb090k0ghBBCCImIQRvQCCGEEEKGKwpohBBCCCGDDAU0QgghhJBBhgIaIYQQQsggQwGNEEIIIWSQoYBGCCGEEDLIUEAjhBBCCBlkBiSgcRw3j+O4MxzH1XAc97/62b+C47jvOY47wXHcPziOyx6I9yWEEEIIuRPddEDjOE4C4A0ABQCyASztJ4BtZYxNZIxNBvAagHU3+76EEEIIIXeqgaigTQdQwxg7xxjzAfgIwE96HsAY6+qxqgHABuB9CSGEEELuSAMxF2cKgIYe62YAd/U9iOO45wAUA5AD+NEAvC8hhBBCyB3ptk2Wzhh7A8AbHMc9BuAVAE/2PYbjuF8A+AUAjBgx4nY1jRBCyCDARAbRLUB0+iE6fGACA3iA43lAwoGTcAAfepbw4HguuD28jbt0LE9zjpOhbSACWiMAU4/11NC2K/kIwMb+djDG3gbwNgDk5eVRNyghhAxxzC8iEApcAYcfosOPgMMH0eGH6Ly0HAitQxygH/0cgmGuO8RJOIDnewS5y8Pe1fb1Coh8aLuEg3pyHGQJmoFpMyE9DERAOwpgFMdxIxEMZksAPNbzAI7jRjHGqkOrDwKoBiGEkCGHMQbmFnqHLWcoYPUIYaLTj4DdB+YN9HseTsaD18kh0cggiVZAnqoDr5WB18gg0cnAa+TgZDwQYGCiGHwOMEDs/dzvvoAIJvZcZ8H1AAMLiMHtl+0Tg+uCCDF8bPc5xEvHdb9vgAGiCLlJRwGN3BI3HdAYYwLHcUUA9gCQAHiXMXaK47jVAMoYY9sBFHEcNweAH4AF/XRvEkIIiQwmdFe5elS3eiwHnH6Idl/4mH6rXBzAq2XgtTJItDLIUrRQamTgdTJINPJg+NLKINGGluWS2/+FEjKEcIwNzp7EvLw8VlZWFulmEELIkMYYg2j3Q+h0Q+jwQOj0INDpgWDxhLsWmUfo/8VSHhKtLFzp6hmwJD0DlyZY+aLrvgYEfYgEwG28SYAQQsitwfwiBEsofHW4IXR6wo9ApwfML146mAMkegUkMQrIkjVQdgesy8KXHJycB8dRXiAkEiigEULIIMcYg+j0X6p+dXQHMDcCnR4Euny9RpfkZDyksUpIY1VQjooJLhuUkBiUkMYowUlplj9CBjsKaIQQMgiwgIiAxRsOXsFq2KVKWN+L7fkoOaQGJRSZ0cHwFauC1BAMYrxWRpUvQoY4CmiEEHKbiG4BQo8uyEB3V2SHGwGrt/ccK1IuFLhUUIzUB6tfBiWksUpIYpR0kT0hdzgKaISQQY8xBuYXwbwBMF8gfE1V+B6n7gXW60WXbet1U1Tf13ZvY5dWLp2/xwG93qP3OVj4HD0uzA+HMA+Yu/fF+LxGBmmsEvK0KEinBMNYdwjjdXK66J6QYYwCGiFkQPUNU2J3qPIGIPoCYF4xuD20Lby91zEBMJ946bW+wNCcwZfnII1RQBKrgjpVF+6ClHSHMAX9CCaE9I9+OhBCgoN2eoTgOFcuAcwj3J4wxQGcQgJeLgGnkISX+ShF7+1yHrxCAk4eesj40GAEHHpdasWF/3NpsILw86Xt4ZdwfY69wjkuvUfoxb2Ov/wcHMcFB1yNVlAVjBDyg1BAI+QO013BEp3+8CPgEnqtB7f5ITpD213+a4cqHuDkVwlT4QDVO0zxiu6Q1eOY0Gsh5ehidkII6QcFNEIGORYQIYYCVqA7YPUIV4HwenBbwOkHBLH/k/Gh0d41MvBqGWTxKvCaqPC6RCMDr5aCU0opTBFCSARRQCPkNgrPY9i3ouXyI+AUeq0Hw5dw5VHeAXBKSbArTSODJEoBWZIWvEYaClqy8AjvvEYGSSh4UZcbIYQMfhTQCLlFmMggdLjhb3TAd9ERenZedidfmJTrFaxkMcrgXIZqae+g1b2sktKAo4QQcoeigEbIAGABEf7WYBjzX3TA1+iAv8kZvGAeAKQcZIkaqCcaIY1T9Q5aoUBG0+oQQgjpRgGNkBvE/CL8zc4eVTEH/M1OQAheZc/JeciStNDkJUCWrIUsRQtZvAqchKpdhBBCrg8FNEKuQvQK8Dc5gxWxi85ghazVCYSuweeUUshTNNDOSIY8FMaksSq6zosQQshNoYBGSIjo8oeqYq
(base64-encoded PNG image data truncated for readability; the output is the figure produced by plot_collection(trial, 'average_shap'))\n",
+      "text/plain": [
+       ""
" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "plot_collection(trial,\"average_shap\")" + ] + } + ], + "metadata": { + "kernelspec": { + "display_name": "conda_python3", + "language": "python", + "name": "conda_python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.6.5" + } + }, + "nbformat": 4, + "nbformat_minor": 2 +} diff --git a/sagemaker-debugger/xgboost_realtime_analysis/cm.gif b/sagemaker-debugger/xgboost_realtime_analysis/cm.gif new file mode 100644 index 0000000000..5304a0a040 Binary files /dev/null and b/sagemaker-debugger/xgboost_realtime_analysis/cm.gif differ diff --git a/sagemaker-debugger/xgboost_realtime_analysis/data_utils.py b/sagemaker-debugger/xgboost_realtime_analysis/data_utils.py new file mode 100644 index 0000000000..893498a285 --- /dev/null +++ b/sagemaker-debugger/xgboost_realtime_analysis/data_utils.py @@ -0,0 +1,47 @@ +import bz2 +import random +import tempfile +import urllib.request + +import boto3 + + +def load_mnist(train_split=0.8, seed=42): + + if not (0 < train_split <= 1): + raise ValueError("'train_split' must be between 0 and 1.") + + url = "https://www.csie.ntu.edu.tw/~cjlin/libsvmtools/datasets/multiclass/mnist.bz2" + + with tempfile.NamedTemporaryFile(mode="wb", delete=False) as mnist_bz2: + urllib.request.urlretrieve(url, mnist_bz2.name) + + with bz2.open(mnist_bz2.name, "r") as fin: + content = fin.read().decode("utf-8") + lines = content.strip().split('\n') + n = sum(1 for line in lines) + indices = list(range(n)) + random.seed(seed) + random.shuffle(indices) + train_indices = set(indices[:int(n * 0.8)]) + + with tempfile.NamedTemporaryFile(mode='w', delete=False) as train_file: + with tempfile.NamedTemporaryFile(mode='w', delete=False) as valid_file: + for idx, line in enumerate(lines): + if idx in train_indices: + train_file.write(line + '\n') + else: + valid_file.write(line + '\n') + + return train_file.name, valid_file.name + + +def write_to_s3(fobj, bucket, key): + return boto3.Session().resource('s3').Bucket(bucket).Object(key).upload_fileobj(fobj) + + +def upload_to_s3(filename, bucket, key): + url = f"s3://{bucket}/{key}" + print(f"Writing to {url}") + with open(filename, "rb") as fobj: + write_to_s3(fobj, bucket, key) diff --git a/sagemaker-debugger/xgboost_realtime_analysis/xgboost-realtime-analysis.ipynb b/sagemaker-debugger/xgboost_realtime_analysis/xgboost-realtime-analysis.ipynb new file mode 100644 index 0000000000..fa0af13793 --- /dev/null +++ b/sagemaker-debugger/xgboost_realtime_analysis/xgboost-realtime-analysis.ipynb @@ -0,0 +1,452 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Debugging XGBoost training jobs in real time with Amazon SageMaker Debugger \n", + "\n", + "This notebook uses the MNIST dataset to demonstrate real-time analysis of XGBoost training jobs while the training jobs are running. \n", + "\n", + "This notebook was created and tested on an ml.m5.4xlarge notebook instance using 100GB instance volume. \n", + "\n", + "## Overview \n", + "\n", + "Amazon SageMaker Debugger is a new capability of Amazon SageMaker that allows debugging machine learning training. \n", + "SageMaker Debugger helps you to monitor your training in near real time using rules and provides alerts if it detects an inconsistency in training. 
\n", + "\n", + "Using SageMaker Debugger is a two step process: Saving tensors and analysis. \n", + "Let's look at each one of them closely. \n", + "\n", + "### Saving tensors\n", + "\n", + "In deep learning algorithms, tensors define the state of the training job at any particular instant in its lifecycle.\n", + "Amazon SageMaker Debugger exposes a library which allows you to capture these tensors and save them for analysis.\n", + "Although XGBoost is not a deep learning algorithm, Amazon SageMaker Debugger is highly customizable and can help you interpret results by saving insightful metrics. For example, performance metrics or the importance of features at different frequencies. \n", + "Refer to [documentation](https://github.com/awslabs/sagemaker-debugger/blob/master/docs/xgboost.md) for details on how to save the metrics you want. \n", + "\n", + "\n", + "### Analysis\n", + "\n", + "There are two ways to get to tensors and run analysis on them.\n", + "\n", + "One way is to use concept called ***Rules***. On a very broad level, a rule is Python code used to detect certain conditions during training.\n", + "Some of the conditions that a data scientist training an algorithm may care about are monitoring for gradients getting too large or too small, detecting overfitting, and so on.\n", + "Amazon SageMaker Debugger comes pre-packaged with certain built-in rules that can be invoked on Amazon SageMaker. You can also write your own rules using the Amazon SageMaker Debugger APIs. \n", + "For more details about automatic analysis using rules, see [rules documentation](https://github.com/awslabs/sagemaker-debugger/tree/master/docs/rules).\n", + "\n", + "This notebook focuses on another approach: **Manual analysis**, which can be performed in realtime while training jobs are running.\n", + "\n", + "Manual analysis is helpful to detect which type of issue you're running into. You save raw tensors in order to understand your data and model better and figure out the root cause of your training job problem.\n", + "\n", + "Manual analysis is powered by the Amazon SageMaker Debugger API. This API framework enables retrieving tensors and scalars (e.g., debugging data) saved during training job via few lines of code. One of the most powerful features provided by it is realtime access to data. You can get tensors and scalars ***while your training job is running***.\n", + "\n", + "![Animated confusion matrix](cm.gif)" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "! python -m pip install smdebug" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "import boto3\n", + "import sagemaker" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Amazon SageMaker Debugger is available in Amazon SageMaker XGBoost container version `0.90-2` or later. If you want to use XGBoost with Amazon SageMaker Debugger, you have to specify `repo_version='0.90-2'` in the `get_image_uri` function." 
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "from sagemaker.amazon.amazon_estimator import get_image_uri\n", + "\n", + "# Below changes the region to be one where this notebook is running\n", + "region = boto3.Session().region_name\n", + "container = get_image_uri(region, \"xgboost\", repo_version=\"0.90-2\")" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Training XGBoost models in Amazon SageMaker with Amazon SageMaker Debugger\n", + "\n", + "In this section you learn to train an XGBoost model with Amazon SageMaker Debugger enabled and monitor the training jobs.\n", + "This is done using the SageMaker [Estimator API](https://sagemaker.readthedocs.io/en/stable/estimators.html#sagemaker.estimator.Estimator).\n", + "While training job is running use the SageMaker Debugger API to access saved tensors in real time and visualize them.\n", + "You can rely on SageMaker Debugger to take care of downloading fresh set of tensors every time we query for them.\n", + "\n", + "This example is adapted from [XGBoost for Classification](https://github.com/awslabs/amazon-sagemaker/examples/tree/master/introduction_to_amazon_algorithms/xgboost_mnist).\n", + "Refer to [XGBoost for Classification](https://github.com/awslabs/amazon-sagemaker/examples/tree/master/introduction_to_amazon_algorithms/xgboost_mnist) for an example of using classification from Amazon SageMaker's implementation of [XGBoost](https://github.com/dmlc/xgboost).\n", + "\n", + "### Data preparation\n", + "\n", + "Use the [MNIST data](https://www.csie.ntu.edu.tw/~cjlin/libsvmtools/datasets/multiclass.html) stored in [LIBSVM](https://www.csie.ntu.edu.tw/~cjlin/libsvm/) format." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "from data_utils import load_mnist, upload_to_s3\n", + "\n", + "bucket = sagemaker.Session().default_bucket()\n", + "prefix = \"DEMO-smdebug-xgboost-mnist\"" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "%%time\n", + "\n", + "train_file, validation_file = load_mnist()\n", + "upload_to_s3(train_file, bucket, f\"{prefix}/train/mnist.train.libsvm\")\n", + "upload_to_s3(validation_file, bucket, f\"{prefix}/validation/mnist.validation.libsvm\")" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Enabling Amazon SageMaker Debugger in the estimator object\n", + "\n", + "Enabling Amazon SageMaker Debugger in a training job can be accomplished by adding its configuration into an Estimator object constructor:\n", + "\n", + "```\n", + "from sagemaker.debugger import DebuggerHookConfig\n", + "\n", + "estimator = Estimator(\n", + " ...,\n", + " debugger_hook_config = DebuggerHookConfig(\n", + " s3_output_path=\"s3://{bucket_name}/{location_in_bucket}\", # Required\n", + " collection_configs=[\n", + " CollectionConfig(\n", + " name=\"metrics\",\n", + " parameters={\n", + " \"save_interval\": \"10\"\n", + " }\n", + " )\n", + " ]\n", + " )\n", + ")\n", + "```\n", + "Here, the `DebuggerHookConfig` object configures which data `Estimator` should save for the real-time visualization. Provide two parameters:\n", + "\n", + "- `s3_output_path`: Points to an S3 bucket where you intend to store the debugging tensors. Amount of data saved depends on multiple factors, major ones are training job, data set, model, frequency of saving tensors. 
This bucket should be in your AWS account and you should have full access control over it. **Important**: This S3 bucket should be created in the same Region where your training job is running, otherwise you might run into problems with cross-Region access.\n",
+ "\n",
+ "- `collection_configs`: Enumerates named collections of tensors to save. Collections are a convenient way to organize relevant tensors under the same umbrella to make it easy to navigate them during analysis. In this particular example, you are interested in a single collection named `metrics`. You also configure Amazon SageMaker Debugger to save the metrics every 10 iterations. For all parameters that are supported by Collections and DebuggerConfig, see the [Collection documentation](https://github.com/awslabs/sagemaker-debugger/blob/master/docs/api.md)."
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "### Using Amazon SageMaker Debugger with XGBoost Classification\n",
+ "\n",
+ "Import the libraries for the demo of Amazon SageMaker Debugger."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "from sagemaker import get_execution_role\n",
+ "\n",
+ "role = get_execution_role()\n",
+ "base_job_name = \"demo-smdebug-xgboost-classification\"\n",
+ "bucket_path = 's3://{}'.format(bucket)\n",
+ "\n",
+ "num_round = 25\n",
+ "save_interval = 3\n",
+ "hyperparameters={\n",
+ "    \"max_depth\": \"5\",\n",
+ "    \"eta\": \"0.1\",\n",
+ "    \"gamma\": \"4\",\n",
+ "    \"min_child_weight\": \"6\",\n",
+ "    \"silent\": \"0\",\n",
+ "    \"objective\": \"multi:softmax\",\n",
+ "    \"num_class\": \"10\",  # num_class is required for 'multi:*' objectives\n",
+ "    \"num_round\": num_round,\n",
+ "}\n"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "from sagemaker.estimator import Estimator\n",
+ "from sagemaker.debugger import DebuggerHookConfig, CollectionConfig\n",
+ "\n",
+ "xgboost_algorithm_mode_estimator = Estimator(\n",
+ "    role=role,\n",
+ "    base_job_name=base_job_name,\n",
+ "    train_instance_count=1,\n",
+ "    train_instance_type='ml.m5.xlarge',\n",
+ "    image_name=container,\n",
+ "    hyperparameters=hyperparameters,\n",
+ "    train_max_run=1800,\n",
+ "\n",
+ "    debugger_hook_config = DebuggerHookConfig(\n",
+ "        s3_output_path=bucket_path,  # Required\n",
+ "        collection_configs=[\n",
+ "            CollectionConfig(\n",
+ "                name=\"metrics\",\n",
+ "                parameters={\n",
+ "                    \"save_interval\": str(save_interval)\n",
+ "                }\n",
+ "            ),\n",
+ "            CollectionConfig(\n",
+ "                name=\"predictions\",\n",
+ "                parameters={\n",
+ "                    \"save_interval\": str(save_interval)\n",
+ "                }\n",
+ "            ),\n",
+ "            CollectionConfig(\n",
+ "                name=\"labels\",\n",
+ "                parameters={\n",
+ "                    \"save_interval\": str(save_interval)\n",
+ "                }\n",
+ "            )\n",
+ "        ]\n",
+ "    )\n",
+ ")"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {
+ "pycharm": {
+ "name": "#%% md\n"
+ }
+ },
+ "source": [
+ "In the next step you actually start a training job using the Estimator object you created above. The job is started in an asynchronous, non-blocking way, which means that control is passed back to the notebook and further commands can be run while the training job progresses."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "from sagemaker.session import s3_input\n",
+ "\n",
+ "train_s3_input = s3_input(\"s3://{}/{}/{}\".format(bucket, prefix, \"train\"), content_type=\"libsvm\")\n",
+ "validation_s3_input = s3_input(\"s3://{}/{}/{}\".format(bucket, prefix, \"validation\"), content_type=\"libsvm\")\n",
+ "\n",
+ "# This is a fire-and-forget event. By setting wait=False, you just submit the job to run in the background.\n",
+ "# Amazon SageMaker will start one training job and release control to the next cells in the notebook.\n",
+ "# Follow this notebook to see the status of the training job.\n",
+ "xgboost_algorithm_mode_estimator.fit(\n",
+ "    {\n",
+ "        \"train\": train_s3_input,\n",
+ "        \"validation\": validation_s3_input\n",
+ "    },\n",
+ "    wait=False\n",
+ ")"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "### Result\n",
+ "\n",
+ "As a result of the above command, Amazon SageMaker starts one training job for you, and that job produces the tensors to be analyzed.\n",
+ "The job runs in the background, so you don't have to wait for it to complete before continuing with the rest of the notebook.\n",
+ "Because of the asynchronous nature of the training job, you need to monitor its status so that you don't start requesting debugging data too early.\n",
+ "\n",
+ "\n",
+ "## Analysis and Visualization\n",
+ "\n",
+ "### Checking on the training job status\n",
+ "\n",
+ "Check the status of the training job by running the following code.\n",
+ "It checks on the status of an Amazon SageMaker training job every 15 seconds.\n",
+ "Once the job has started its training cycle, control is released to the next cells in the notebook.\n",
+ "That means the training job has started to tune the model and, in parallel, emit debugging tensors."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "import time\n",
+ "from time import gmtime, strftime\n",
+ "\n",
+ "\n",
+ "# The command below gives the status of the training job\n",
+ "job_name = xgboost_algorithm_mode_estimator.latest_training_job.name\n",
+ "client = xgboost_algorithm_mode_estimator.sagemaker_session.sagemaker_client\n",
+ "description = client.describe_training_job(TrainingJobName=job_name)\n",
+ "print('Training job name: ' + job_name)\n",
+ "\n",
+ "if description['TrainingJobStatus'] != 'Completed':\n",
+ "    while description['SecondaryStatus'] not in ['Training', 'Completed']:\n",
+ "        description = client.describe_training_job(TrainingJobName=job_name)\n",
+ "        primary_status = description['TrainingJobStatus']\n",
+ "        secondary_status = description['SecondaryStatus']\n",
+ "        print(\"{}: {}, {}\".format(strftime('%X', gmtime()), primary_status, secondary_status))\n",
+ "        time.sleep(15)"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "### Retrieving and Analyzing tensors\n",
+ "\n",
+ "Before getting to the analysis, here are some notes on the concepts used in Amazon SageMaker Debugger that help with analysis.\n",
+ "- ***Trial*** - Object that is the centerpiece of the SageMaker Debugger API when it comes to getting access to tensors. It is a top-level abstraction that represents a single run of a training job. All tensors emitted by a training job are associated with its *trial*.\n",
+ "- ***Step*** - Object that represents the next level of abstraction. In SageMaker Debugger, a *step* is a representation of a single batch of a training job. Each trial has multiple steps. 
Each tensor is associated with multiple steps and has a particular value at each of the steps.\n",
+ "- ***Tensor*** - Object that represents an actual *tensor* saved during the training job. *Note*: it can be a scalar as well (for example, metrics are saved as scalars).\n",
+ "\n",
+ "For more details on the aforementioned concepts, as well as on the SageMaker Debugger API in general (including examples), see the [SageMaker Debugger Analysis API](https://github.com/awslabs/sagemaker-debugger/blob/master/docs/analysis.md) documentation.\n",
+ "\n",
+ "In the following code cell, use a ***Trial*** to access tensors. You can do that by inspecting the currently running training job and extracting the necessary parameters from its debug configuration to tell SageMaker Debugger where the data you are looking for is located. Keep in mind the following:\n",
+ "- Tensors are stored in your own S3 bucket, which you can navigate to and inspect manually if desired.\n",
+ "- You might notice a slight delay before the trial object is created. This is normal: SageMaker Debugger monitors the corresponding bucket and waits until tensors appear in it. The delay is introduced by the less-than-instantaneous upload of tensors from the training container to your S3 bucket. "
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "from smdebug.trials import create_trial\n",
+ "\n",
+ "description = client.describe_training_job(TrainingJobName=job_name)\n",
+ "s3_output_path = xgboost_algorithm_mode_estimator.latest_job_debugger_artifacts_path()\n",
+ "\n",
+ "# This is where we create a Trial object that allows access to saved tensors.\n",
+ "trial = create_trial(s3_output_path)"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "import numpy as np\n",
+ "import matplotlib.pyplot as plt\n",
+ "import seaborn as sns\n",
+ "from sklearn.metrics import confusion_matrix\n",
+ "from IPython.display import display, clear_output\n",
+ "\n",
+ "\n",
+ "def plot_confusion_for_one_step(trial, step, ax=None):\n",
+ "    if ax is None:\n",
+ "        fig, ax = plt.subplots()\n",
+ "    cm = confusion_matrix(\n",
+ "        trial.tensor(\"labels\").value(step),\n",
+ "        trial.tensor(\"predictions\").value(step)\n",
+ "    )\n",
+ "    normalized_cm = cm.astype('float') / cm.sum(axis=1)[:, np.newaxis]\n",
+ "    sns.heatmap(normalized_cm, cmap=\"bone\", ax=ax, cbar=False, annot=cm, fmt='')\n",
+ "    print(f\"iteration: {step}\")\n",
+ "\n",
+ "\n",
+ "def plot_and_update_confusion_for_all_steps(trial):\n",
+ "\n",
+ "    fig, ax = plt.subplots()\n",
+ "    rendered_steps = []\n",
+ "    # trial.loaded_all_steps is a way to keep monitoring the state of a training job\n",
+ "    # as seen by Amazon SageMaker Debugger.\n",
+ "    # When the training job is completed, the Trial becomes aware of it.\n",
+ "    while not rendered_steps or not trial.loaded_all_steps:\n",
+ "        steps = trial.steps()\n",
+ "        # quick way to get the diff between two lists\n",
+ "        steps_to_render = list(set(steps).symmetric_difference(set(rendered_steps)))\n",
+ "        # plot only from the newer chunk\n",
+ "        for step in steps_to_render:\n",
+ "            clear_output(wait=True)\n",
+ "            plot_confusion_for_one_step(trial, step, ax=ax)\n",
+ "            display(fig)\n",
+ "            plt.pause(5)\n",
+ "            ax.clear()\n",
+ "        rendered_steps.extend(steps_to_render)\n",
+ "    fig.clear()\n",
+ "    plt.close()"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "### Visualizing the confusion matrix of a running training 
job\n", + "\n", + "Finally, wait until Amazon SageMaker Debugger has downloaded initial collection of tensors to look at. Once that collection is ready you keep getting new tensors every five seconds and plot their tensors correspondingly one under another." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "plot_and_update_confusion_for_all_steps(trial)" + ] + } + ], + "metadata": { + "kernelspec": { + "display_name": "conda_python3", + "language": "python", + "name": "conda_python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.6.5" + }, + "pycharm": { + "stem_cell": { + "cell_type": "raw", + "metadata": { + "collapsed": false + }, + "source": [] + } + } + }, + "nbformat": 4, + "nbformat_minor": 4 +} diff --git a/sagemaker-experiments/mnist-handwritten-digits-classification-experiment.ipynb b/sagemaker-experiments/mnist-handwritten-digits-classification-experiment.ipynb new file mode 100644 index 0000000000..16b4358c56 --- /dev/null +++ b/sagemaker-experiments/mnist-handwritten-digits-classification-experiment.ipynb @@ -0,0 +1,482 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## MNIST Handwritten Digits Classification Experiment\n", + "\n", + "This demo shows how you can use SageMaker Experiment Management Python SDK to organize, track, compare, and evaluate your machine learning (ML) model training experiments.\n", + "\n", + "You can track artifacts for experiments, including data sets, algorithms, hyper-parameters, and metrics. Experiments executed on SageMaker such as SageMaker Autopilot jobs and training jobs are automatically tracked will be automatically tracked. You can also track artifacts for additional steps within an ML workflow that come before/after model training e.g. data pre-processing or post-training model evaluation.\n", + "\n", + "The APIs also let you search and browse your current and past experiments, compare experiments, and identify best performing models.\n", + "\n", + "Now we will demonstrate these capabilities through an MNIST handwritten digits classification example. The experiment will be organized as follow:\n", + "\n", + "1. Download and prepare the MNIST dataset.\n", + "2. Train a Convolutional Neural Network (CNN) Model. Tune the hyper parameter that configures the number of hidden channels in the model. Track the parameter configurations and resulting model accuracy using SageMaker Experiments Python SDK.\n", + "3. Finally use the search and analytics capabilities of Python SDK to search, compare and evaluate the performance of all model versions generated from model tuning in Step 2.\n", + "4. We will also see an example of tracing the complete linage of a model version i.e. the collection of all the data pre-processing and training configurations and inputs that went into creating that model version.\n", + "\n", + "Note this notebook can only be ran on sagemaker notebook instance with `conda_pytorch_p36` kernel." 
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "### Install Python SDKs"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "import sys"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "!{sys.executable} -m pip install sagemaker-experiments"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "### Setup"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "import time\n",
+ "\n",
+ "import boto3\n",
+ "import numpy as np\n",
+ "import pandas as pd\n",
+ "%config InlineBackend.figure_format = 'retina'\n",
+ "from matplotlib import pyplot as plt\n",
+ "from torchvision import datasets, transforms\n",
+ "\n",
+ "import sagemaker\n",
+ "from sagemaker.session import Session\n",
+ "from sagemaker.analytics import ExperimentAnalytics\n",
+ "\n",
+ "from smexperiments.experiment import Experiment\n",
+ "from smexperiments.trial import Trial\n",
+ "from smexperiments.trial_component import TrialComponent\n",
+ "from smexperiments.tracker import Tracker"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "sess = boto3.Session()\n",
+ "sm = sess.client('sagemaker')\n",
+ "role = sagemaker.get_execution_role()"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "### Create an S3 bucket to hold data"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "scrolled": true
+ },
+ "outputs": [],
+ "source": [
+ "# create an S3 bucket to hold data; note that your account might have already created a bucket with the same name\n",
+ "account_id = sess.client('sts').get_caller_identity()[\"Account\"]\n",
+ "bucket = 'sagemaker-experiments-{}-{}'.format(sess.region_name, account_id)\n",
+ "prefix = 'mnist'\n",
+ "\n",
+ "try:\n",
+ "    if sess.region_name == \"us-east-1\":\n",
+ "        sess.client('s3').create_bucket(Bucket=bucket)\n",
+ "    else:\n",
+ "        sess.client('s3').create_bucket(Bucket=bucket, \n",
+ "                                        CreateBucketConfiguration={'LocationConstraint': sess.region_name})\n",
+ "except Exception as e:\n",
+ "    print(e)"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "### Dataset\n",
+ "We download the MNIST handwritten digits dataset and then apply a transformation to each image."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "scrolled": true
+ },
+ "outputs": [],
+ "source": [
+ "# download the dataset\n",
+ "# this not only downloads data to the ./mnist folder, but also loads and transforms (normalizes) the images\n",
+ "train_set = datasets.MNIST('mnist', train=True, transform=transforms.Compose([\n",
+ "    transforms.ToTensor(),\n",
+ "    transforms.Normalize((0.1307,), (0.3081,))]), \n",
+ "    download=True)\n",
+ "    \n",
+ "test_set = datasets.MNIST('mnist', train=False, transform=transforms.Compose([\n",
+ "    transforms.ToTensor(),\n",
+ "    transforms.Normalize((0.1307,), (0.3081,))]),\n",
+ "    download=False)"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "plt.imshow(train_set.data[2].numpy())"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "After transforming the images in the dataset, we upload them to S3."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "inputs = sagemaker.Session().upload_data(path='mnist', bucket=bucket, key_prefix=prefix)\n",
+ "print('input spec: {}'.format(inputs))"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "Now let's track the parameters from the data pre-processing step."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "with Tracker.create(display_name=\"Preprocessing\", sagemaker_boto_client=sm) as tracker:\n",
+ "    tracker.log_parameters({\n",
+ "        \"normalization_mean\": 0.1307,\n",
+ "        \"normalization_std\": 0.3081,\n",
+ "    })\n",
+ "    # we can log the s3 uri to the dataset we just uploaded\n",
+ "    tracker.log_input(name=\"mnist-dataset\", media_type=\"s3/uri\", value=inputs)"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "### Step 1 - Set up the Experiment\n",
+ "\n",
+ "Create an experiment to track all the model training iterations. Experiments are a great way to organize your data science work. You can create experiments to organize all your model development work for: [1] a business use case you are addressing (e.g., create an experiment named “customer churn prediction”), or [2] a data science team that owns the experiment (e.g., create an experiment named “marketing analytics experiment”), or [3] a specific data science and ML project. Think of it as a “folder” for organizing your “files”."
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "### Create an Experiment"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "scrolled": false
+ },
+ "outputs": [],
+ "source": [
+ "mnist_experiment = Experiment.create(\n",
+ "    experiment_name=f\"mnist-hand-written-digits-classification-{int(time.time())}\", \n",
+ "    description=\"Classification of mnist hand-written digits\", \n",
+ "    sagemaker_boto_client=sm)\n",
+ "print(mnist_experiment)"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "### Step 2 - Track Experiment\n"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "### Now create a Trial for each training run to track its inputs, parameters, and metrics."
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "While training the CNN model on SageMaker, we will experiment with several values for the number of hidden channels in the model. We will create a Trial to track each training job run. We will also create a TrialComponent from the tracker we created before, and add it to the Trial. This will enrich the Trial with the parameters we captured from the data pre-processing stage.\n",
+ "\n",
+ "Note that the execution of the following code takes a while."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "from sagemaker.pytorch import PyTorch"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "hidden_channel_trial_name_map = {}"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "If you want to run the following training jobs asynchronously, you may need to increase your resource limit. Otherwise, you can run them sequentially."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "preprocessing_trial_component = tracker.trial_component"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "scrolled": true
+ },
+ "outputs": [],
+ "source": [
+ "for i, num_hidden_channel in enumerate([2, 5, 10, 20, 32]):\n",
+ "    # create trial\n",
+ "    trial_name = f\"cnn-training-job-{num_hidden_channel}-hidden-channels-{int(time.time())}\"\n",
+ "    cnn_trial = Trial.create(\n",
+ "        trial_name=trial_name, \n",
+ "        experiment_name=mnist_experiment.experiment_name,\n",
+ "        sagemaker_boto_client=sm,\n",
+ "    )\n",
+ "    hidden_channel_trial_name_map[num_hidden_channel] = trial_name\n",
+ "    \n",
+ "    # associate the preprocessing trial component with the current trial\n",
+ "    cnn_trial.add_trial_component(preprocessing_trial_component)\n",
+ "    \n",
+ "    # all input configurations, parameters, and metrics specified in the estimator \n",
+ "    # definition are automatically tracked\n",
+ "    estimator = PyTorch(\n",
+ "        entry_point='./mnist.py',\n",
+ "        role=role,\n",
+ "        sagemaker_session=sagemaker.Session(sagemaker_client=sm),\n",
+ "        framework_version='1.1.0',\n",
+ "        train_instance_count=1,\n",
+ "        train_instance_type='ml.c4.xlarge',\n",
+ "        hyperparameters={\n",
+ "            'epochs': 2,\n",
+ "            'backend': 'gloo',\n",
+ "            'hidden_channels': num_hidden_channel,\n",
+ "            'dropout': 0.2,\n",
+ "            'optimizer': 'sgd'\n",
+ "        },\n",
+ "        metric_definitions=[\n",
+ "            {'Name':'train:loss', 'Regex':'Train Loss: (.*?);'},\n",
+ "            {'Name':'test:loss', 'Regex':'Test Average loss: (.*?),'},\n",
+ "            {'Name':'test:accuracy', 'Regex':'Test Accuracy: (.*?)%;'}\n",
+ "        ],\n",
+ "        enable_sagemaker_metrics=True,\n",
+ "    )\n",
+ "    \n",
+ "    cnn_training_job_name = \"cnn-training-job-{}\".format(int(time.time()))\n",
+ "    \n",
+ "    # Now associate the estimator with the Experiment and Trial\n",
+ "    estimator.fit(\n",
+ "        inputs={'training': inputs}, \n",
+ "        job_name=cnn_training_job_name,\n",
+ "        experiment_config={\n",
+ "            \"TrialName\": cnn_trial.trial_name,\n",
+ "            \"TrialComponentDisplayName\": \"Training\",\n",
+ "        },\n",
+ "        wait=True,\n",
+ "    )\n",
+ "    \n",
+ "    # give it a while before dispatching the next training job\n",
+ "    time.sleep(2)"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "### Compare the model training runs for an experiment\n",
+ "\n",
+ "Now we will use the analytics capabilities of the Python SDK to query and compare the training runs and identify the best model produced by our experiment. You can retrieve trial components by using a search expression."
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "### Some Simple Analyses"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "search_expression = {\n",
+ "    \"Filters\":[\n",
+ "        {\n",
+ "            \"Name\": \"DisplayName\",\n",
+ "            \"Operator\": \"Equals\",\n",
+ "            \"Value\": \"Training\",\n",
+ "        }\n",
+ "    ],\n",
+ "}"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "trial_component_analytics = ExperimentAnalytics(\n",
+ "    sagemaker_session=Session(sess, sm), \n",
+ "    experiment_name=mnist_experiment.experiment_name,\n",
+ "    search_expression=search_expression,\n",
+ "    sort_by=\"metrics.test:accuracy.max\",\n",
+ "    sort_order=\"Descending\",\n",
+ "    metric_names=['test:accuracy'],\n",
+ "    parameter_names=['hidden_channels', 'epochs', 'dropout', 'optimizer']\n",
+ ")"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "analytic_table = trial_component_analytics.dataframe()"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "analytic_table"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "To isolate and measure the impact of a change in hidden channels on model accuracy, we vary the number of hidden channels and fix the values of the other hyperparameters.\n",
+ "\n",
+ "Next, let's look at an example of tracing the lineage of a model by accessing the data tracked by SageMaker Experiments for the `cnn-training-job-2-hidden-channels` trial."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "scrolled": true
+ },
+ "outputs": [],
+ "source": [
+ "lineage_table = ExperimentAnalytics(\n",
+ "    sagemaker_session=Session(sess, sm), \n",
+ "    search_expression={\n",
+ "        \"Filters\":[{\n",
+ "            \"Name\": \"Parents.TrialName\",\n",
+ "            \"Operator\": \"Equals\",\n",
+ "            \"Value\": hidden_channel_trial_name_map[2]\n",
+ "        }]\n",
+ "    },\n",
+ "    sort_by=\"CreationTime\",\n",
+ "    sort_order=\"Ascending\",\n",
+ ")"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "scrolled": true
+ },
+ "outputs": [],
+ "source": [
+ "lineage_table.dataframe()"
+ ]
+ }
+ ],
+ "metadata": {
+ "kernelspec": {
+ "display_name": "conda_pytorch_p36",
+ "language": "python",
+ "name": "conda_pytorch_p36"
+ },
+ "language_info": {
+ "codemirror_mode": {
+ "name": "ipython",
+ "version": 3
+ },
+ "file_extension": ".py",
+ "mimetype": "text/x-python",
+ "name": "python",
+ "nbconvert_exporter": "python",
+ "pygments_lexer": "ipython3",
+ "version": "3.6.5"
+ }
+ },
+ "nbformat": 4,
+ "nbformat_minor": 2
+}
diff --git a/sagemaker-experiments/mnist.py b/sagemaker-experiments/mnist.py
new file mode 100644
index 0000000000..ee598a8eb8
--- /dev/null
+++ b/sagemaker-experiments/mnist.py
@@ -0,0 +1,222 @@
+import argparse
+import json
+import logging
+import os
+from os.path import join
+import sagemaker_containers
+import sys
+import torch
+import torch.distributed as dist
+import torch.nn as nn
+import torch.nn.functional as F
+import torch.optim as optim
+import torch.utils.data
+import torch.utils.data.distributed
+from torchvision import datasets, transforms
+
+import boto3
+
+import time
+
+logger = logging.getLogger(__name__)
+logger.setLevel(logging.DEBUG)
+logger.addHandler(logging.StreamHandler(sys.stdout))
+
+if 'SAGEMAKER_METRICS_DIRECTORY' in os.environ:
+    log_file_handler = 
logging.FileHandler(join(os.environ['SAGEMAKER_METRICS_DIRECTORY'], "metrics.json"))
+    # setFormatter expects a logging.Formatter instance, not a bare format string
+    log_file_handler.setFormatter(
+        logging.Formatter(
+            "{'time':'%(asctime)s', 'name': '%(name)s', \
+            'level': '%(levelname)s', 'message': '%(message)s'}"
+        )
+    )
+    logger.addHandler(log_file_handler)
+
+# Based on https://github.com/pytorch/examples/blob/master/mnist/main.py
+class Net(nn.Module):
+    def __init__(self, hidden_channels, kernel_size=5, drop_out=.5):
+        super(Net, self).__init__()
+        self.conv1 = nn.Conv2d(1, hidden_channels, kernel_size=kernel_size)
+        self.conv2 = nn.Conv2d(hidden_channels, 20, kernel_size=kernel_size)
+        self.conv2_drop = nn.Dropout2d(p=drop_out)
+        self.fc1 = nn.Linear(320, 50)
+        self.fc2 = nn.Linear(50, 10)
+
+    def forward(self, x):
+        x = F.relu(F.max_pool2d(self.conv1(x), 2))
+        x = F.relu(F.max_pool2d(self.conv2_drop(self.conv2(x)), 2))
+        x = x.view(-1, 320)
+        x = F.relu(self.fc1(x))
+        x = F.dropout(x, training=self.training)
+        x = self.fc2(x)
+        return F.log_softmax(x, dim=1)
+
+
+def _get_train_data_loader(batch_size, training_dir, is_distributed, **kwargs):
+    logger.info("Get train data loader")
+    dataset = datasets.MNIST(training_dir, train=True, transform=transforms.Compose([
+        transforms.ToTensor(),
+        transforms.Normalize((0.1307,), (0.3081,))
+    ]), download=False)
+    train_sampler = torch.utils.data.distributed.DistributedSampler(dataset) if is_distributed else None
+    return torch.utils.data.DataLoader(dataset, batch_size=batch_size, shuffle=train_sampler is None,
+                                       sampler=train_sampler, **kwargs)
+
+
+def _get_test_data_loader(test_batch_size, training_dir, **kwargs):
+    logger.info("Get test data loader")
+    return torch.utils.data.DataLoader(
+        datasets.MNIST(training_dir, train=False, transform=transforms.Compose([
+            transforms.ToTensor(),
+            transforms.Normalize((0.1307,), (0.3081,))
+        ]), download=False),
+        batch_size=test_batch_size, shuffle=True, **kwargs)
+
+
+def _average_gradients(model):
+    # Gradient averaging.
+    size = float(dist.get_world_size())
+    for param in model.parameters():
+        dist.all_reduce(param.grad.data, op=dist.reduce_op.SUM)
+        param.grad.data /= size
+
+
+def train(args, tracker=None):
+    is_distributed = len(args.hosts) > 1 and args.backend is not None
+    logger.debug("Distributed training - {}".format(is_distributed))
+    use_cuda = args.num_gpus > 0
+    logger.debug("Number of gpus available - {}".format(args.num_gpus))
+    kwargs = {'num_workers': 1, 'pin_memory': True} if use_cuda else {}
+    device = torch.device("cuda" if use_cuda else "cpu")
+
+    if is_distributed:
+        # Initialize the distributed environment.
+        world_size = len(args.hosts)
+        os.environ['WORLD_SIZE'] = str(world_size)
+        host_rank = args.hosts.index(args.current_host)
+        os.environ['RANK'] = str(host_rank)
+        dist.init_process_group(backend=args.backend, rank=host_rank, world_size=world_size)
+        logger.info('Initialized the distributed environment: \'{}\' backend on {} nodes. '.format(
+            args.backend, dist.get_world_size()) + 'Current host rank is {}. Number of gpus: {}'.format(
+            dist.get_rank(), args.num_gpus))
+
+    # set the seed for generating random numbers
+    torch.manual_seed(args.seed)
+    if use_cuda:
+        torch.cuda.manual_seed(args.seed)
+
+    train_loader = _get_train_data_loader(args.batch_size, args.data_dir, is_distributed, **kwargs)
+    test_loader = _get_test_data_loader(args.test_batch_size, args.data_dir, **kwargs)
+
+    logger.info("Processes {}/{} ({:.0f}%) of train data".format(
+        len(train_loader.sampler), len(train_loader.dataset),
+        100. 
* len(train_loader.sampler) / len(train_loader.dataset) + )) + + logger.info("Processes {}/{} ({:.0f}%) of test data".format( + len(test_loader.sampler), len(test_loader.dataset), + 100. * len(test_loader.sampler) / len(test_loader.dataset) + )) + + model = Net(args.hidden_channels, args.kernel_size, args.dropout).to(device) + if is_distributed and use_cuda: + # multi-machine multi-gpu case + model = torch.nn.parallel.DistributedDataParallel(model) + else: + # single-machine multi-gpu case or single-machine or multi-machine cpu case + model = torch.nn.DataParallel(model) + + if args.optimizer == 'sgd': + optimizer = optim.SGD(model.parameters(), lr=args.lr, momentum=args.momentum) + else: + optimizer = optim.Adam(model.parameters(), lr=args.lr) + + for epoch in range(1, args.epochs + 1): + model.train() + for batch_idx, (data, target) in enumerate(train_loader, 1): + data, target = data.to(device), target.to(device) + optimizer.zero_grad() + output = model(data) + loss = F.nll_loss(output, target) + loss.backward() + if is_distributed and not use_cuda: + # average gradients manually for multi-machine cpu case only + _average_gradients(model) + optimizer.step() + if batch_idx % args.log_interval == 0: + logger.info('Train Epoch: {} [{}/{} ({:.0f}%)], Train Loss: {:.6f};'.format( + epoch, batch_idx * len(data), len(train_loader.sampler), + 100. * batch_idx / len(train_loader), loss.item())) + test(model, test_loader, device, tracker) + save_model(model, args.model_dir) + + +def test(model, test_loader, device, tracker=None): + model.eval() + test_loss = 0 + correct = 0 + with torch.no_grad(): + for data, target in test_loader: + data, target = data.to(device), target.to(device) + output = model(data) + test_loss += F.nll_loss(output, target, size_average=False).item() # sum up batch loss + pred = output.max(1, keepdim=True)[1] # get the index of the max log-probability + correct += pred.eq(target.view_as(pred)).sum().item() + + test_loss /= len(test_loader.dataset) + logger.info('Test Average loss: {:.4f}, Test Accuracy: {:.0f}%;\n'.format( + test_loss, 100. 
* correct / len(test_loader.dataset)))
+
+
+def model_fn(model_dir):
+    device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
+    # Net requires hidden_channels; use 10, the --hidden_channels default below,
+    # so the architecture matches a model trained with default settings.
+    model = torch.nn.DataParallel(Net(hidden_channels=10))
+    with open(os.path.join(model_dir, 'model.pth'), 'rb') as f:
+        model.load_state_dict(torch.load(f))
+    return model.to(device)
+
+def save_model(model, model_dir):
+    logger.info("Saving the model.")
+    path = os.path.join(model_dir, 'model.pth')
+    # recommended way from http://pytorch.org/docs/master/notes/serialization.html
+    torch.save(model.cpu().state_dict(), path)
+
+
+if __name__ == '__main__':
+    parser = argparse.ArgumentParser()
+
+    # Data and model checkpoints directories
+    parser.add_argument('--batch-size', type=int, default=64, metavar='N',
+                        help='input batch size for training (default: 64)')
+    parser.add_argument('--test-batch-size', type=int, default=1000, metavar='N',
+                        help='input batch size for testing (default: 1000)')
+    parser.add_argument('--epochs', type=int, default=10, metavar='N',
+                        help='number of epochs to train (default: 10)')
+    parser.add_argument('--optimizer', type=str, default="sgd",
+                        help='optimizer for training.')
+    parser.add_argument('--lr', type=float, default=0.01, metavar='LR',
+                        help='learning rate (default: 0.01)')
+    parser.add_argument('--dropout', type=float, default=0.5, metavar='DROP',
+                        help='dropout rate (default: 0.5)')
+    parser.add_argument('--kernel_size', type=int, default=5, metavar='KERNEL',
+                        help='conv2d filter kernel size (default: 5)')
+    parser.add_argument('--momentum', type=float, default=0.5, metavar='M',
+                        help='SGD momentum (default: 0.5)')
+    parser.add_argument('--hidden_channels', type=int, default=10,
+                        help='number of channels in hidden conv layer')
+    parser.add_argument('--seed', type=int, default=1, metavar='S',
+                        help='random seed (default: 1)')
+    parser.add_argument('--log-interval', type=int, default=100, metavar='N',
+                        help='how many batches to wait before logging training status')
+    parser.add_argument('--backend', type=str, default=None,
+                        help='backend for distributed training (tcp, gloo on cpu and gloo, nccl on gpu)')
+
+
+    # Container environment
+    parser.add_argument('--hosts', type=list, default=json.loads(os.environ['SM_HOSTS']))
+    parser.add_argument('--current-host', type=str, default=os.environ['SM_CURRENT_HOST'])
+    parser.add_argument('--model-dir', type=str, default=os.environ['SM_MODEL_DIR'])
+    parser.add_argument('--data-dir', type=str, default=os.environ['SM_CHANNEL_TRAINING'])
+    parser.add_argument('--num-gpus', type=int, default=os.environ['SM_NUM_GPUS'])
+
+    args = parser.parse_args()
+
+    train(args)
\ No newline at end of file
diff --git a/sagemaker-python-sdk/dgl_gcmc/GCMC.Dockerfile b/sagemaker-python-sdk/dgl_gcmc/GCMC.Dockerfile
new file mode 100644
index 0000000000..311b7568f2
--- /dev/null
+++ b/sagemaker-python-sdk/dgl_gcmc/GCMC.Dockerfile
@@ -0,0 +1,5 @@
+FROM 763104351884.dkr.ecr.us-east-2.amazonaws.com/mxnet-training:1.6.0-gpu-py36-ubuntu16.04
+
+RUN pip install gluonnlp pandas
+RUN pip install spacy
+RUN python3 -m spacy download en
\ No newline at end of file
diff --git a/sagemaker-python-sdk/dgl_gcmc/README.md b/sagemaker-python-sdk/dgl_gcmc/README.md
new file mode 100644
index 0000000000..39ef74e118
--- /dev/null
+++ b/sagemaker-python-sdk/dgl_gcmc/README.md
@@ -0,0 +1,6 @@
+# Example Recommender System Using Deep Graph Library with a Graph Convolutional Matrix Completion Network
+This example is a recommender system for movie reviews. 
It uses a graph convolutional matrix completion (GCMC) network trained on the MovieLens datasets. These datasets consist of movie titles, genres, and ratings by users. + +It uses [Apache MXNet](https://mxnet.apache.org/) as its backend. [MXNet built with large tensor support](https://mxnet.apache.org/api/faq/large_tensor_support) is required. This example shows you how you can build your own container with the necessary dependencies to run the network on Amazon SageMaker. + +For more information about Deep Graph Library (DGL) please visit the DGL documentation website: https://docs.dgl.ai diff --git a/sagemaker-python-sdk/dgl_gcmc/data.py b/sagemaker-python-sdk/dgl_gcmc/data.py new file mode 100644 index 0000000000..dc32c74727 --- /dev/null +++ b/sagemaker-python-sdk/dgl_gcmc/data.py @@ -0,0 +1,529 @@ +"""MovieLens dataset""" +import numpy as np +import os +import re +import pandas as pd +import scipy.sparse as sp +import gluonnlp as nlp +import mxnet as mx + +import dgl +from dgl.data.utils import download, extract_archive, get_download_dir + +_urls = { + 'ml-100k' : 'http://files.grouplens.org/datasets/movielens/ml-100k.zip', + 'ml-1m' : 'http://files.grouplens.org/datasets/movielens/ml-1m.zip', + 'ml-10m' : 'http://files.grouplens.org/datasets/movielens/ml-10m.zip', +} + +READ_DATASET_PATH = get_download_dir() +GENRES_ML_100K =\ + ['unknown', 'Action', 'Adventure', 'Animation', + 'Children', 'Comedy', 'Crime', 'Documentary', 'Drama', 'Fantasy', + 'Film-Noir', 'Horror', 'Musical', 'Mystery', 'Romance', 'Sci-Fi', + 'Thriller', 'War', 'Western'] +GENRES_ML_1M = GENRES_ML_100K[1:] +GENRES_ML_10M = GENRES_ML_100K + ['IMAX'] + +class MovieLens(object): + """MovieLens dataset used by GCMC model + + TODO(minjie): make this dataset more general + + The dataset stores MovieLens ratings in two types of graphs. The encoder graph + contains rating value information in the form of edge types. The decoder graph + stores plain user-movie pairs in the form of a bipartite graph with no rating + information. All graphs have two types of nodes: "user" and "movie". + + The training, validation and test set can be summarized as follows: + + training_enc_graph : training user-movie pairs + rating info + training_dec_graph : training user-movie pairs + valid_enc_graph : training user-movie pairs + rating info + valid_dec_graph : validation user-movie pairs + test_enc_graph : training user-movie pairs + validation user-movie pairs + rating info + test_dec_graph : test user-movie pairs + + Attributes + ---------- + train_enc_graph : dgl.DGLHeteroGraph + Encoder graph for training. + train_dec_graph : dgl.DGLHeteroGraph + Decoder graph for training. + train_labels : mx.nd.NDArray + The categorical label of each user-movie pair + train_truths : mx.nd.NDArray + The actual rating values of each user-movie pair + valid_enc_graph : dgl.DGLHeteroGraph + Encoder graph for validation. + valid_dec_graph : dgl.DGLHeteroGraph + Decoder graph for validation. + valid_labels : mx.nd.NDArray + The categorical label of each user-movie pair + valid_truths : mx.nd.NDArray + The actual rating values of each user-movie pair + test_enc_graph : dgl.DGLHeteroGraph + Encoder graph for test. + test_dec_graph : dgl.DGLHeteroGraph + Decoder graph for test. + test_labels : mx.nd.NDArray + The categorical label of each user-movie pair + test_truths : mx.nd.NDArray + The actual rating values of each user-movie pair + user_feature : mx.nd.NDArray + User feature tensor. If None, representing an identity matrix. 
+    movie_feature : mx.nd.NDArray
+        Movie feature tensor. If None, representing an identity matrix.
+    possible_rating_values : np.ndarray
+        Available rating values in the dataset
+
+    Parameters
+    ----------
+    name : str
+        Dataset name. Could be "ml-100k", "ml-1m", "ml-10m"
+    ctx : mx.context.Context
+        Device context
+    use_one_hot_fea : bool, optional
+        If true, the ``user_feature`` attribute is None, representing a one-hot identity
+        matrix. (Default: False)
+    symm : bool, optional
+        If true, use the symmetric normalization constant. Otherwise, use the left
+        normalization constant. (Default: True)
+    test_ratio : float, optional
+        Ratio of test data
+    valid_ratio : float, optional
+        Ratio of validation data
+
+    """
+    def __init__(self, name, ctx, use_one_hot_fea=False, symm=True,
+                 test_ratio=0.1, valid_ratio=0.1):
+        self._name = name
+        self._ctx = ctx
+        self._symm = symm
+        self._test_ratio = test_ratio
+        self._valid_ratio = valid_ratio
+        # download and extract
+        download_dir = get_download_dir()
+        zip_file_path = '{}/{}.zip'.format(download_dir, name)
+        download(_urls[name], path=zip_file_path)
+        extract_archive(zip_file_path, '{}/{}'.format(download_dir, name))
+        if name == 'ml-10m':
+            root_folder = 'ml-10M100K'
+        else:
+            root_folder = name
+        self._dir = os.path.join(download_dir, name, root_folder)
+        print("Starting processing {} ...".format(self._name))
+        self._load_raw_user_info()
+        self._load_raw_movie_info()
+        print('......')
+        if self._name == 'ml-100k':
+            self.all_train_rating_info = self._load_raw_rates(os.path.join(self._dir, 'u1.base'), '\t')
+            self.test_rating_info = self._load_raw_rates(os.path.join(self._dir, 'u1.test'), '\t')
+            self.all_rating_info = pd.concat([self.all_train_rating_info, self.test_rating_info])
+        elif self._name == 'ml-1m' or self._name == 'ml-10m':
+            self.all_rating_info = self._load_raw_rates(os.path.join(self._dir, 'ratings.dat'), '::')
+            num_test = int(np.ceil(self.all_rating_info.shape[0] * self._test_ratio))
+            shuffled_idx = np.random.permutation(self.all_rating_info.shape[0])
+            self.test_rating_info = self.all_rating_info.iloc[shuffled_idx[: num_test]]
+            self.all_train_rating_info = self.all_rating_info.iloc[shuffled_idx[num_test: ]]
+        else:
+            raise NotImplementedError
+        print('......')
+        num_valid = int(np.ceil(self.all_train_rating_info.shape[0] * self._valid_ratio))
+        shuffled_idx = np.random.permutation(self.all_train_rating_info.shape[0])
+        self.valid_rating_info = self.all_train_rating_info.iloc[shuffled_idx[: num_valid]]
+        self.train_rating_info = self.all_train_rating_info.iloc[shuffled_idx[num_valid: ]]
+        self.possible_rating_values = np.unique(self.train_rating_info["rating"].values)
+
+        print("All rating pairs : {}".format(self.all_rating_info.shape[0]))
+        print("\tAll train rating pairs : {}".format(self.all_train_rating_info.shape[0]))
+        print("\t\tTrain rating pairs : {}".format(self.train_rating_info.shape[0]))
+        print("\t\tValid rating pairs : {}".format(self.valid_rating_info.shape[0]))
+        print("\tTest rating pairs : {}".format(self.test_rating_info.shape[0]))
+
+        self.user_info = self._drop_unseen_nodes(orign_info=self.user_info,
+                                                 cmp_col_name="id",
+                                                 reserved_ids_set=set(self.all_rating_info["user_id"].values),
+                                                 label="user")
+        self.movie_info = self._drop_unseen_nodes(orign_info=self.movie_info,
+                                                  cmp_col_name="id",
+                                                  reserved_ids_set=set(self.all_rating_info["movie_id"].values),
+                                                  label="movie")
+
+        # Map user/movie to the global id
+        self.global_user_id_map = {ele: i for i, ele in enumerate(self.user_info['id'])}
+        
self.global_movie_id_map = {ele: i for i, ele in enumerate(self.movie_info['id'])} + print('Total user number = {}, movie number = {}'.format(len(self.global_user_id_map), + len(self.global_movie_id_map))) + self._num_user = len(self.global_user_id_map) + self._num_movie = len(self.global_movie_id_map) + + ### Generate features + if use_one_hot_fea: + self.user_feature = None + self.movie_feature = None + else: + self.user_feature = mx.nd.array(self._process_user_fea(), ctx=ctx, dtype=np.float32) + self.movie_feature = mx.nd.array(self._process_movie_fea(), ctx=ctx, dtype=np.float32) + if self.user_feature is None: + self.user_feature_shape = (self.num_user, self.num_user) + self.movie_feature_shape = (self.num_movie, self.num_movie) + else: + self.user_feature_shape = self.user_feature.shape + self.movie_feature_shape = self.movie_feature.shape + info_line = "Feature dim: " + info_line += "\nuser: {}".format(self.user_feature_shape) + info_line += "\nmovie: {}".format(self.movie_feature_shape) + print(info_line) + + all_train_rating_pairs, all_train_rating_values = self._generate_pair_value(self.all_train_rating_info) + train_rating_pairs, train_rating_values = self._generate_pair_value(self.train_rating_info) + valid_rating_pairs, valid_rating_values = self._generate_pair_value(self.valid_rating_info) + test_rating_pairs, test_rating_values = self._generate_pair_value(self.test_rating_info) + + def _make_labels(ratings): + labels = mx.nd.array(np.searchsorted(self.possible_rating_values, ratings), + ctx=ctx, dtype=np.int32) + return labels + + self.train_enc_graph = self._generate_enc_graph(train_rating_pairs, train_rating_values, add_support=True) + self.train_dec_graph = self._generate_dec_graph(train_rating_pairs) + self.train_labels = _make_labels(train_rating_values) + self.train_truths = mx.nd.array(train_rating_values, ctx=ctx, dtype=np.float32) + + self.valid_enc_graph = self.train_enc_graph + self.valid_dec_graph = self._generate_dec_graph(valid_rating_pairs) + self.valid_labels = _make_labels(valid_rating_values) + self.valid_truths = mx.nd.array(valid_rating_values, ctx=ctx, dtype=np.float32) + + self.test_enc_graph = self._generate_enc_graph(all_train_rating_pairs, all_train_rating_values, add_support=True) + self.test_dec_graph = self._generate_dec_graph(test_rating_pairs) + self.test_labels = _make_labels(test_rating_values) + self.test_truths = mx.nd.array(test_rating_values, ctx=ctx, dtype=np.float32) + + def _npairs(graph): + rst = 0 + for r in self.possible_rating_values: + rst += graph.number_of_edges(str(r)) + return rst + + print("Train enc graph: \t#user:{}\t#movie:{}\t#pairs:{}".format( + self.train_enc_graph.number_of_nodes('user'), self.train_enc_graph.number_of_nodes('movie'), + _npairs(self.train_enc_graph))) + print("Train dec graph: \t#user:{}\t#movie:{}\t#pairs:{}".format( + self.train_dec_graph.number_of_nodes('user'), self.train_dec_graph.number_of_nodes('movie'), + self.train_dec_graph.number_of_edges())) + print("Valid enc graph: \t#user:{}\t#movie:{}\t#pairs:{}".format( + self.valid_enc_graph.number_of_nodes('user'), self.valid_enc_graph.number_of_nodes('movie'), + _npairs(self.valid_enc_graph))) + print("Valid dec graph: \t#user:{}\t#movie:{}\t#pairs:{}".format( + self.valid_dec_graph.number_of_nodes('user'), self.valid_dec_graph.number_of_nodes('movie'), + self.valid_dec_graph.number_of_edges())) + print("Test enc graph: \t#user:{}\t#movie:{}\t#pairs:{}".format( + self.test_enc_graph.number_of_nodes('user'), 
self.test_enc_graph.number_of_nodes('movie'), + _npairs(self.test_enc_graph))) + print("Test dec graph: \t#user:{}\t#movie:{}\t#pairs:{}".format( + self.test_dec_graph.number_of_nodes('user'), self.test_dec_graph.number_of_nodes('movie'), + self.test_dec_graph.number_of_edges())) + + def _generate_pair_value(self, rating_info): + rating_pairs = (np.array([self.global_user_id_map[ele] for ele in rating_info["user_id"]], + dtype=np.int64), + np.array([self.global_movie_id_map[ele] for ele in rating_info["movie_id"]], + dtype=np.int64)) + rating_values = rating_info["rating"].values.astype(np.float32) + return rating_pairs, rating_values + + def _generate_enc_graph(self, rating_pairs, rating_values, add_support=False): + user_movie_R = np.zeros((self._num_user, self._num_movie), dtype=np.float32) + user_movie_R[rating_pairs] = rating_values + movie_user_R = user_movie_R.transpose() + + rating_graphs = [] + rating_row, rating_col = rating_pairs + for rating in self.possible_rating_values: + ridx = np.where(rating_values == rating) + rrow = rating_row[ridx] + rcol = rating_col[ridx] + bg = dgl.bipartite((rrow, rcol), 'user', str(rating), 'movie', + card=(self._num_user, self._num_movie)) + rev_bg = dgl.bipartite((rcol, rrow), 'movie', 'rev-%s' % str(rating), 'user', + card=(self._num_movie, self._num_user)) + rating_graphs.append(bg) + rating_graphs.append(rev_bg) + graph = dgl.hetero_from_relations(rating_graphs) + + # sanity check + assert len(rating_pairs[0]) == sum([graph.number_of_edges(et) for et in graph.etypes]) // 2 + + if add_support: + def _calc_norm(x): + x = x.asnumpy().astype('float32') + x[x == 0.] = np.inf + x = mx.nd.array(1. / np.sqrt(x)) + return x.as_in_context(self._ctx).expand_dims(1) + user_ci = [] + user_cj = [] + movie_ci = [] + movie_cj = [] + for r in self.possible_rating_values: + r = str(r) + user_ci.append(graph['rev-%s' % r].in_degrees()) + movie_ci.append(graph[r].in_degrees()) + if self._symm: + user_cj.append(graph[r].out_degrees()) + movie_cj.append(graph['rev-%s' % r].out_degrees()) + else: + user_cj.append(mx.nd.zeros((self.num_user,))) + movie_cj.append(mx.nd.zeros((self.num_movie,))) + user_ci = _calc_norm(mx.nd.add_n(*user_ci)) + movie_ci = _calc_norm(mx.nd.add_n(*movie_ci)) + if self._symm: + user_cj = _calc_norm(mx.nd.add_n(*user_cj)) + movie_cj = _calc_norm(mx.nd.add_n(*movie_cj)) + else: + user_cj = mx.nd.ones((self.num_user,), ctx=self._ctx) + movie_cj = mx.nd.ones((self.num_movie,), ctx=self._ctx) + graph.nodes['user'].data.update({'ci' : user_ci, 'cj' : user_cj}) + graph.nodes['movie'].data.update({'ci' : movie_ci, 'cj' : movie_cj}) + + return graph + + def _generate_dec_graph(self, rating_pairs): + ones = np.ones_like(rating_pairs[0]) + user_movie_ratings_coo = sp.coo_matrix( + (ones, rating_pairs), + shape=(self.num_user, self.num_movie), dtype=np.float32) + return dgl.bipartite(user_movie_ratings_coo, 'user', 'rate', 'movie') + + @property + def num_links(self): + return self.possible_rating_values.size + + @property + def num_user(self): + return self._num_user + + @property + def num_movie(self): + return self._num_movie + + def _drop_unseen_nodes(self, orign_info, cmp_col_name, reserved_ids_set, label): + # print(" -----------------") + # print("{}: {}(reserved) v.s. 
{}(from info)".format(label, len(reserved_ids_set), + # len(set(orign_info[cmp_col_name].values)))) + if reserved_ids_set != set(orign_info[cmp_col_name].values): + pd_rating_ids = pd.DataFrame(list(reserved_ids_set), columns=["id_graph"]) + # print("\torign_info: ({}, {})".format(orign_info.shape[0], orign_info.shape[1])) + data_info = orign_info.merge(pd_rating_ids, left_on=cmp_col_name, right_on='id_graph', how='outer') + data_info = data_info.dropna(subset=[cmp_col_name, 'id_graph']) + data_info = data_info.drop(columns=["id_graph"]) + data_info = data_info.reset_index(drop=True) + # print("\tAfter dropping, data shape: ({}, {})".format(data_info.shape[0], data_info.shape[1])) + return data_info + else: + orign_info = orign_info.reset_index(drop=True) + return orign_info + + def _load_raw_rates(self, file_path, sep): + """In MovieLens, the rates have the following format + + ml-100k + user id \t movie id \t rating \t timestamp + + ml-1m/10m + UserID::MovieID::Rating::Timestamp + + timestamp is unix timestamp and can be converted by pd.to_datetime(X, unit='s') + + Parameters + ---------- + file_path : str + + Returns + ------- + rating_info : pd.DataFrame + """ + rating_info = pd.read_csv( + file_path, sep=sep, header=None, + names=['user_id', 'movie_id', 'rating', 'timestamp'], + dtype={'user_id': np.int32, 'movie_id' : np.int32, + 'ratings': np.float32, 'timestamp': np.int64}, engine='python') + return rating_info + + def _load_raw_user_info(self): + """In MovieLens, the user attributes file have the following formats: + + ml-100k: + user id | age | gender | occupation | zip code + + ml-1m: + UserID::Gender::Age::Occupation::Zip-code + + For ml-10m, there is no user information. We read the user id from the rating file. + + Parameters + ---------- + name : str + + Returns + ------- + user_info : pd.DataFrame + """ + if self._name == 'ml-100k': + self.user_info = pd.read_csv(os.path.join(self._dir, 'u.user'), sep='|', header=None, + names=['id', 'age', 'gender', 'occupation', 'zip_code'], engine='python') + elif self._name == 'ml-1m': + self.user_info = pd.read_csv(os.path.join(self._dir, 'users.dat'), sep='::', header=None, + names=['id', 'gender', 'age', 'occupation', 'zip_code'], engine='python') + elif self._name == 'ml-10m': + rating_info = pd.read_csv( + os.path.join(self._dir, 'ratings.dat'), sep='::', header=None, + names=['user_id', 'movie_id', 'rating', 'timestamp'], + dtype={'user_id': np.int32, 'movie_id': np.int32, 'ratings': np.float32, + 'timestamp': np.int64}, engine='python') + self.user_info = pd.DataFrame(np.unique(rating_info['user_id'].values.astype(np.int32)), + columns=['id']) + else: + raise NotImplementedError + + def _process_user_fea(self): + """ + + Parameters + ---------- + user_info : pd.DataFrame + name : str + For ml-100k and ml-1m, the column name is ['id', 'gender', 'age', 'occupation', 'zip_code']. + We take the age, gender, and the one-hot encoding of the occupation as the user features. + For ml-10m, there is no user feature and we set the feature to be a single zero. 
+ + Returns + ------- + user_features : np.ndarray + + """ + if self._name == 'ml-100k' or self._name == 'ml-1m': + ages = self.user_info['age'].values.astype(np.float32) + gender = (self.user_info['gender'] == 'F').values.astype(np.float32) + all_occupations = set(self.user_info['occupation']) + occupation_map = {ele: i for i, ele in enumerate(all_occupations)} + occupation_one_hot = np.zeros(shape=(self.user_info.shape[0], len(all_occupations)), + dtype=np.float32) + occupation_one_hot[np.arange(self.user_info.shape[0]), + np.array([occupation_map[ele] for ele in self.user_info['occupation']])] = 1 + user_features = np.concatenate([ages.reshape((self.user_info.shape[0], 1)) / 50.0, + gender.reshape((self.user_info.shape[0], 1)), + occupation_one_hot], axis=1) + elif self._name == 'ml-10m': + user_features = np.zeros(shape=(self.user_info.shape[0], 1), dtype=np.float32) + else: + raise NotImplementedError + return user_features + + def _load_raw_movie_info(self): + """In MovieLens, the movie attributes may have the following formats: + + In ml_100k: + + movie id | movie title | release date | video release date | IMDb URL | [genres] + + In ml_1m, ml_10m: + + MovieID::Title (Release Year)::Genres + + Also, Genres are separated by |, e.g., Adventure|Animation|Children|Comedy|Fantasy + + Parameters + ---------- + name : str + + Returns + ------- + movie_info : pd.DataFrame + For ml-100k, the column name is ['id', 'title', 'release_date', 'video_release_date', 'url'] + [GENRES (19)]] + For ml-1m and ml-10m, the column name is ['id', 'title'] + [GENRES (18/20)]] + """ + if self._name == 'ml-100k': + GENRES = GENRES_ML_100K + elif self._name == 'ml-1m': + GENRES = GENRES_ML_1M + elif self._name == 'ml-10m': + GENRES = GENRES_ML_10M + else: + raise NotImplementedError + + if self._name == 'ml-100k': + file_path = os.path.join(self._dir, 'u.item') + self.movie_info = pd.read_csv(file_path, sep='|', header=None, + names=['id', 'title', 'release_date', 'video_release_date', 'url'] + GENRES, + engine='python') + elif self._name == 'ml-1m' or self._name == 'ml-10m': + file_path = os.path.join(self._dir, 'movies.dat') + movie_info = pd.read_csv(file_path, sep='::', header=None, + names=['id', 'title', 'genres'], engine='python') + genre_map = {ele: i for i, ele in enumerate(GENRES)} + genre_map['Children\'s'] = genre_map['Children'] + genre_map['Childrens'] = genre_map['Children'] + movie_genres = np.zeros(shape=(movie_info.shape[0], len(GENRES)), dtype=np.float32) + for i, genres in enumerate(movie_info['genres']): + for ele in genres.split('|'): + if ele in genre_map: + movie_genres[i, genre_map[ele]] = 1.0 + else: + print('genres not found, filled with unknown: {}'.format(genres)) + movie_genres[i, genre_map['unknown']] = 1.0 + for idx, genre_name in enumerate(GENRES): + assert idx == genre_map[genre_name] + movie_info[genre_name] = movie_genres[:, idx] + self.movie_info = movie_info.drop(columns=["genres"]) + else: + raise NotImplementedError + + def _process_movie_fea(self): + """ + + Parameters + ---------- + movie_info : pd.DataFrame + name : str + + Returns + ------- + movie_features : np.ndarray + Generate movie features by concatenating embedding and the year + + """ + if self._name == 'ml-100k': + GENRES = GENRES_ML_100K + elif self._name == 'ml-1m': + GENRES = GENRES_ML_1M + elif self._name == 'ml-10m': + GENRES = GENRES_ML_10M + else: + raise NotImplementedError + + word_embedding = nlp.embedding.GloVe('glove.840B.300d') + tokenizer = nlp.data.transforms.SpacyTokenizer() + + 
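# Movie features concatenate the 300-d mean GloVe embedding of each title,
+        # the release year (normalized below), and the genre indicator columns.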
title_embedding = np.zeros(shape=(self.movie_info.shape[0], 300), dtype=np.float32)
+        release_years = np.zeros(shape=(self.movie_info.shape[0], 1), dtype=np.float32)
+        p = re.compile(r'(.+)\s*\((\d+)\)')
+        for i, title in enumerate(self.movie_info['title']):
+            match_res = p.match(title)
+            if match_res is None:
+                print('{} cannot be matched, index={}, name={}'.format(title, i, self._name))
+                title_context, year = title, 1950
+            else:
+                title_context, year = match_res.groups()
+            # Use the average of the GloVe embeddings of the title tokens
+            title_embedding[i, :] = word_embedding[tokenizer(title_context)].asnumpy().mean(axis=0)
+            release_years[i] = float(year)
+        movie_features = np.concatenate((title_embedding,
+                                         (release_years - 1950.0) / 100.0,
+                                         self.movie_info[GENRES]),
+                                        axis=1)
+        return movie_features
+
+if __name__ == '__main__':
+    MovieLens("ml-100k", ctx=mx.cpu(), symm=True)
diff --git a/sagemaker-python-sdk/dgl_gcmc/model.py b/sagemaker-python-sdk/dgl_gcmc/model.py
new file mode 100644
index 0000000000..75171b36ee
--- /dev/null
+++ b/sagemaker-python-sdk/dgl_gcmc/model.py
@@ -0,0 +1,233 @@
+"""NN modules"""
+import math
+
+import numpy as np
+import mxnet as mx
+import mxnet.ndarray as F
+from mxnet.gluon import nn, Block
+import dgl.function as fn
+
+from utils import get_activation
+
+class GCMCLayer(Block):
+    r"""GCMC layer
+
+    .. math::
+        z_j^{(l+1)} = \sigma_{agg}\left[\mathrm{agg}\left(
+        \sum_{j\in\mathcal{N}_1}\frac{1}{c_{ij}}W_1h_j, \ldots,
+        \sum_{j\in\mathcal{N}_R}\frac{1}{c_{ij}}W_Rh_j
+        \right)\right]
+
+    After that, apply an extra output projection:
+
+    .. math::
+        h_j^{(l+1)} = \sigma_{out}W_oz_j^{(l+1)}
+
+    The equation is applied to both user nodes and movie nodes and the parameters
+    are not shared unless ``share_user_item_param`` is true.
+
+    Parameters
+    ----------
+    rating_vals : list of int or float
+        Possible rating values.
+    user_in_units : int
+        Size of the user input feature
+    movie_in_units : int
+        Size of the movie input feature
+    msg_units : int
+        Size of the message :math:`W_rh_j`
+    out_units : int
+        Size of the final output user and movie features
+    dropout_rate : float, optional
+        Dropout rate (Default: 0.0)
+    agg : str, optional
+        Function to aggregate messages of different ratings.
+        Could be any of the supported cross type reducers:
+        "sum", "max", "min", "mean", "stack".
+        (Default: "stack")
+    agg_act : callable, str, optional
+        Activation function :math:`\sigma_{agg}`. (Default: None)
+    out_act : callable, str, optional
+        Activation function :math:`\sigma_{out}`. (Default: None)
+    share_user_item_param : bool, optional
+        If true, the user nodes and movie nodes share the same set of parameters.
+        Requires ``user_in_units`` and ``movie_in_units`` to be the same.
+ (Default: False) + """ + def __init__(self, + rating_vals, + user_in_units, + movie_in_units, + msg_units, + out_units, + dropout_rate=0.0, + agg='stack', # or 'sum' + agg_act=None, + out_act=None, + share_user_item_param=False): + super(GCMCLayer, self).__init__() + self.rating_vals = rating_vals + self.agg = agg + self.share_user_item_param = share_user_item_param + if agg == 'stack': + # divide the original msg unit size by number of ratings to keep + # the dimensionality + assert msg_units % len(rating_vals) == 0 + msg_units = msg_units // len(rating_vals) + with self.name_scope(): + self.dropout = nn.Dropout(dropout_rate) + self.W_r = {} + for rating in rating_vals: + rating = str(rating) + if share_user_item_param and user_in_units == movie_in_units: + self.W_r[rating] = self.params.get( + 'W_r_%s' % rating, shape=(user_in_units, msg_units), + dtype=np.float32, allow_deferred_init=True) + self.W_r['rev-%s' % rating] = self.W_r[rating] + else: + self.W_r[rating] = self.params.get( + 'W_r_%s' % rating, shape=(user_in_units, msg_units), + dtype=np.float32, allow_deferred_init=True) + self.W_r['rev-%s' % rating] = self.params.get( + 'revW_r_%s' % rating, shape=(movie_in_units, msg_units), + dtype=np.float32, allow_deferred_init=True) + self.ufc = nn.Dense(out_units) + if share_user_item_param: + self.ifc = self.ufc + else: + self.ifc = nn.Dense(out_units) + self.agg_act = get_activation(agg_act) + self.out_act = get_activation(out_act) + + def forward(self, graph, ufeat=None, ifeat=None): + """Forward function + + Normalizer constant :math:`c_{ij}` is stored as two node data "ci" + and "cj". + + Parameters + ---------- + graph : DGLHeteroGraph + User-movie rating graph. It should contain two node types: "user" + and "movie" and many edge types each for one rating value. + ufeat : mx.nd.NDArray, optional + User features. If None, using an identity matrix. + ifeat : mx.nd.NDArray, optional + Movie features. If None, using an identity matrix. + + Returns + ------- + new_ufeat : mx.nd.NDArray + New user features + new_ifeat : mx.nd.NDArray + New movie features + """ + num_u = graph.number_of_nodes('user') + num_i = graph.number_of_nodes('movie') + funcs = {} + for i, rating in enumerate(self.rating_vals): + rating = str(rating) + # W_r * x + x_u = dot_or_identity(ufeat, self.W_r[rating].data()) + x_i = dot_or_identity(ifeat, self.W_r['rev-%s' % rating].data()) + # left norm and dropout + x_u = x_u * self.dropout(graph.nodes['user'].data['cj']) + x_i = x_i * self.dropout(graph.nodes['movie'].data['cj']) + graph.nodes['user'].data['h%d' % i] = x_u + graph.nodes['movie'].data['h%d' % i] = x_i + funcs[rating] = (fn.copy_u('h%d' % i, 'm'), fn.sum('m', 'h')) + funcs['rev-%s' % rating] = (fn.copy_u('h%d' % i, 'm'), fn.sum('m', 'h')) + # message passing + graph.multi_update_all(funcs, self.agg) + ufeat = graph.nodes['user'].data.pop('h').reshape((num_u, -1)) + ifeat = graph.nodes['movie'].data.pop('h').reshape((num_i, -1)) + # right norm + ufeat = ufeat * graph.nodes['user'].data['ci'] + ifeat = ifeat * graph.nodes['movie'].data['ci'] + # fc and non-linear + ufeat = self.agg_act(ufeat) + ifeat = self.agg_act(ifeat) + ufeat = self.dropout(ufeat) + ifeat = self.dropout(ifeat) + ufeat = self.ufc(ufeat) + ifeat = self.ifc(ifeat) + return self.out_act(ufeat), self.out_act(ifeat) + +class BiDecoder(Block): + r"""Bilinear decoder. + + .. 
math:: + p(M_{ij}=r) = \text{softmax}(u_i^TQ_rv_j) + + The trainable parameter :math:`Q_r` is further decomposed to a linear + combination of basis weight matrices :math:`P_s`: + + .. math:: + Q_r = \sum_{s=1}^{b} a_{rs}P_s + + Parameters + ---------- + rating_vals : list of int or float + Possible rating values. + in_units : int + Size of input user and movie features + num_basis_functions : int, optional + Number of basis. (Default: 2) + dropout_rate : float, optional + Dropout raite (Default: 0.0) + """ + def __init__(self, + rating_vals, + in_units, + num_basis_functions=2, + dropout_rate=0.0): + super(BiDecoder, self).__init__() + self.rating_vals = rating_vals + self._num_basis_functions = num_basis_functions + self.dropout = nn.Dropout(dropout_rate) + self.Ps = [] + with self.name_scope(): + for i in range(num_basis_functions): + self.Ps.append(self.params.get( + 'Ps_%d' % i, shape=(in_units, in_units), + #init=mx.initializer.Orthogonal(scale=1.1, rand_type='normal'), + init=mx.initializer.Xavier(magnitude=math.sqrt(2.0)), + allow_deferred_init=True)) + self.rate_out = nn.Dense(units=len(rating_vals), flatten=False, use_bias=False) + + def forward(self, graph, ufeat, ifeat): + """Forward function. + + Parameters + ---------- + graph : DGLHeteroGraph + "Flattened" user-movie graph with only one edge type. + ufeat : mx.nd.NDArray + User embeddings. Shape: (|V_u|, D) + ifeat : mx.nd.NDArray + Movie embeddings. Shape: (|V_m|, D) + + Returns + ------- + mx.nd.NDArray + Predicting scores for each user-movie edge. + """ + graph = graph.local_var() + ufeat = self.dropout(ufeat) + ifeat = self.dropout(ifeat) + graph.nodes['movie'].data['h'] = ifeat + basis_out = [] + for i in range(self._num_basis_functions): + graph.nodes['user'].data['h'] = F.dot(ufeat, self.Ps[i].data()) + graph.apply_edges(fn.u_dot_v('h', 'h', 'sr')) + basis_out.append(graph.edata['sr'].expand_dims(1)) + out = F.concat(*basis_out, dim=1) + out = self.rate_out(out) + return out + +def dot_or_identity(A, B): + # if A is None, treat as identity matrix + if A is None: + return B + else: + return mx.nd.dot(A, B) diff --git a/sagemaker-python-sdk/dgl_gcmc/mxnet_gcmc.ipynb b/sagemaker-python-sdk/dgl_gcmc/mxnet_gcmc.ipynb new file mode 100644 index 0000000000..226e8ccc85 --- /dev/null +++ b/sagemaker-python-sdk/dgl_gcmc/mxnet_gcmc.ipynb @@ -0,0 +1,207 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Training Graph Convolutional Matrix Completion by using the Deep Graph Library with MXNet backend on Amazon SageMaker\n", + "The **Amazon SageMaker Python SDK** makes it easy to train Deep Graph Library (DGL) models. In this example, you train [Graph Convolutional Matrix Completion](https://arxiv.org/abs/1706.02263) network using the [DMLC DGL API](https://github.com/dmlc/dgl.git) and the [MovieLens dataset](https://grouplens.org/datasets/movielens/). Three datasets are supported:\n", + " * MovieLens 100K Dataset, MovieLens 100K movie ratings. Stable benchmark dataset. 100,000 ratings from 1,000 users on 1,700 movies.\n", + " * MovieLens 1M Dataset, MovieLens 1M movie ratings. Stable benchmark dataset. 1 million ratings from 6,000 users on 4,000 movies.\n", + " * MovieLens 10M Dataset, MovieLens 10M movie ratings. Stable benchmark dataset. 10 million ratings and 100,000 tag applications applied to 10,000 movies by 72,000 users.\n", + "\n", + "### Prerequisites\n", + "To get started, install necessary packages." 
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "!conda install -y boto3\n", + "!conda install -c anaconda -y botocore" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "import sagemaker\n", + "from sagemaker import get_execution_role\n", + "from sagemaker.session import Session\n", + "\n", + "# Setup session\n", + "sess = sagemaker.Session()\n", + "\n", + "# S3 bucket for saving code and model artifacts.\n", + "# Feel free to specify a different bucket here.\n", + "bucket = sess.default_bucket()\n", + "\n", + "# Location to put your custom code.\n", + "custom_code_upload_location = 'customcode'\n", + "\n", + "# Location where results of model training are saved.\n", + "model_artifacts_location = 's3://{}/artifacts'.format(bucket)\n", + "\n", + "# IAM role that gives Amazon SageMaker access to resources in your AWS account.\n", + "# You can use the Amazon SageMaker Python SDK to get the role from your notebook environment. \n", + "role = get_execution_role()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### The training script\n", + "The train.py script provides all the code you need for training an Amazon SageMaker model. " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "!cat train.py" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Build GCMC Docker image\n", + "AWS provides basic Docker images in https://docs.aws.amazon.com/dlami/latest/devguide/deep-learning-containers-images.html. For both PyTorch 1.3 and MXNet 1.6, DGL is preinstalled. As this example needs additional dependencies, you can download a Docker file to build a new image. You should build a GCMC-specific Docker image and push it into your Amazon Elastic Container Registry (Amazon ECR).\n", + "\n", + "Note: Do change the GCMC.Dockerfile with the latest MXNet GPU deep learning containers images name with py3." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "%%sh\n", + "account=$(aws sts get-caller-identity --query Account --output text)\n", + "echo $account\n", + "region=$(aws configure get region)\n", + "\n", + "docker_name=sagemaker-dgl-gcmc\n", + "\n", + "$(aws ecr get-login --no-include-email --region ${region} --registry-ids 763104351884)\n", + "docker build -t $docker_name -f GCMC.Dockerfile .\n", + "\n", + "# Get the login command from ECR and execute it directly\n", + "$(aws ecr get-login --region ${region} --no-include-email)\n", + "\n", + "fullname=\"${account}.dkr.ecr.${region}.amazonaws.com/${docker_name}:latest\"\n", + "# If the repository doesn't exist in ECR, create it.\n", + "aws ecr describe-repositories --repository-names \"${docker_name}\" > /dev/null 2>&1\n", + "if [ $? -ne 0 ]\n", + "then\n", + " aws ecr create-repository --repository-name \"${docker_name}\" > /dev/null\n", + "fi\n", + "\n", + "docker tag ${docker_name} ${fullname}\n", + "\n", + "docker push ${fullname}" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Amazon SageMaker's estimator class\n", + "With the Amazon SageMaker Estimator, you can run a single machine in Amazon SageMaker, using CPU or GPU-based instances.\n", + "\n", + "When you create the estimator, pass-in the file name of the training script and the name of the IAM execution role. You can also use a few other parameters. 
train_instance_count and train_instance_type determine the number and type of Amazon SageMaker instances that will be used for the training job. The hyperparameters parameter is a dictionary of values that is passed to your training script as parameters so that you can use argparse to parse them. You can see how to access these values in the train.py script above.\n", + "\n", + "In this example, you upload the whole code base (including train.py) into an Amazon SageMaker container and run the GCMC training using the MovieLens dataset." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "from sagemaker.mxnet.estimator import MXNet\n", + "\n", + "# Set target dgl-docker name\n", + "docker_name='sagemaker-dgl-gcmc'\n", + "\n", + "CODE_PATH = '../dgl_gcmc'\n", + "CODE_ENTRY = 'train.py'\n", + "#code_location = sess.upload_data(CODE_PATH, bucket=bucket, key_prefix=custom_code_upload_location)\n", + "\n", + "account = sess.boto_session.client('sts').get_caller_identity()['Account']\n", + "region = sess.boto_session.region_name\n", + "image = '{}.dkr.ecr.{}.amazonaws.com/{}:latest'.format(account, region, docker_name)\n", + "print(image)\n", + "\n", + "params = {}\n", + "params['data_name'] = 'ml-1m'\n", + "# set output to SageMaker ML output\n", + "params['save_dir'] = '/opt/ml/model/'\n", + "estimator = MXNet(entry_point=CODE_ENTRY,\n", + " source_dir=CODE_PATH,\n", + " role=role, \n", + " train_instance_count=1, \n", + " train_instance_type='ml.p3.2xlarge',\n", + " image_name=image,\n", + " hyperparameters=params,\n", + " sagemaker_session=sess)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Running the Training Job\n", + "After you construct the Estimator object, fit it using Amazon SageMaker. The dataset is automatically downloaded." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "estimator.fit()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Output\n", + "You can get the model training output from the Amazon Sagemaker console by searching for the training task and looking for the address of 'S3 model artifact'" + ] + } + ], + "metadata": { + "kernelspec": { + "display_name": "conda_mxnet_p36", + "language": "python", + "name": "conda_mxnet_p36" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.6.5" + } + }, + "nbformat": 4, + "nbformat_minor": 2 +} diff --git a/sagemaker-python-sdk/dgl_gcmc/mxnet_gcmc_hypertune.ipynb b/sagemaker-python-sdk/dgl_gcmc/mxnet_gcmc_hypertune.ipynb new file mode 100644 index 0000000000..f2c864c2d3 --- /dev/null +++ b/sagemaker-python-sdk/dgl_gcmc/mxnet_gcmc_hypertune.ipynb @@ -0,0 +1,285 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Graph convolutional matrix completion hyperparameter tuning with Amazon SageMaker and Deep Graph Library with MXNet backend\n", + "_**Creating a hyperparameter tuning job for a DGL network**_\n", + "___\n", + "___\n", + "\n", + "\n", + "## Contents\n", + "1. [Background](#Background) \n", + "2. [Setup](#Setup) \n", + "3. [Code](#Code) \n", + "4. [Tune](#Train) \n", + "5. 
[Wrap-up](#Wrap-up) \n",
+    "\n",
+    "## Background\n",
+    "This example notebook focuses on how to create a graph neural network (GNN) model to train a [Graph Convolutional Matrix Completion (GCMC)](https://arxiv.org/abs/1706.02263) network using DGL with an MXNet backend on the [MovieLens dataset](https://grouplens.org/datasets/movielens/). It leverages SageMaker's hyperparameter tuning to kick off multiple training jobs with different hyperparameter combinations, to find the set with the best model performance. This is an important step in the machine learning process as hyperparameter settings can have a large impact on model accuracy. In this example, you use the [SageMaker Python SDK](https://github.com/aws/sagemaker-python-sdk) to create a hyperparameter tuning job for an Amazon SageMaker estimator."
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "## Setup\n",
+    "This notebook is tested on an ml.p3.2xlarge notebook instance.\n",
+    "\n",
+    "Prerequisites\n",
+    " * You should be able to successfully run the GCMC example. You have your \{account\}.dkr.ecr.\{region\}.amazonaws.com/sagemaker-dgl-gcmc:latest under your Amazon Elastic Container Registry (Amazon ECR) with a specific account and Region.\n",
+    " * You have an S3 bucket and prefix that you want to use for training and model data. It should be in the same Region as the notebook instance, training, and hosting.\n",
+    " * You have established the IAM role Amazon Resource Name (ARN) used to give training and hosting access to your data. See the documentation for more details on creating these. Note: if a role that is not associated with the current notebook instance, or more than one role, is required for training and hosting, replace sagemaker.get_execution_role() with the appropriate full IAM role ARN string."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "import sagemaker\n",
+    "\n",
+    "from sagemaker import get_execution_role\n",
+    "from sagemaker.session import Session\n",
+    "\n",
+    "# Setup session\n",
+    "sess = sagemaker.Session()\n",
+    "\n",
+    "# S3 bucket for saving code and model artifacts.\n",
+    "# Feel free to specify a different bucket here.\n",
+    "bucket = sess.default_bucket()\n",
+    "\n",
+    "# Location to put your custom code.\n",
+    "custom_code_upload_location = 'customcode'\n",
+    "\n",
+    "# Location where results of model training are saved.\n",
+    "model_artifacts_location = 's3://{}/artifacts'.format(bucket)\n",
+    "\n",
+    "# IAM role that gives Amazon SageMaker access to resources in your AWS account.\n",
+    "# You can use the Amazon SageMaker Python SDK to get the role from a notebook environment. \n",
+    "role = sagemaker.get_execution_role()"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "Now import the Python libraries."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "import boto3\n",
+    "from sagemaker.tuner import IntegerParameter, CategoricalParameter, ContinuousParameter, HyperparameterTuner"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "## Code\n",
+    "To use Amazon SageMaker to run Docker containers, you need to provide a Python script for the container to run. In this example, train.py provides all the code you need for training an Amazon SageMaker model.\n",
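+    "\n",
+    "As a side note on how the hyperparameters reach the script: SageMaker passes the `hyperparameters` dictionary to the container as command-line flags, which train.py parses with `argparse`. A minimal sketch of that mechanism (the values below are illustrative):\n",
+    "\n",
+    "```python\n",
+    "import argparse\n",
+    "\n",
+    "# A small subset of the arguments train.py defines\n",
+    "parser = argparse.ArgumentParser()\n",
+    "parser.add_argument('--data_name', default='ml-1m', type=str)\n",
+    "parser.add_argument('--save_dir', type=str)\n",
+    "\n",
+    "# SageMaker turns {'data_name': 'ml-1m', 'save_dir': '/opt/ml/model/'} into this argument list\n",
+    "args = parser.parse_args(['--data_name', 'ml-1m', '--save_dir', '/opt/ml/model/'])\n",
+    "print(args.data_name, args.save_dir)\n",
+    "```"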
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "!cat train.py"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "After you specify and test the training script to ensure it works, start the tuning job. Testing can be done in either local mode or using SageMaker training. "
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "## Tune\n",
+    "Similar to training a single training job in SageMaker, define your training estimator passing in the code scripts, IAM role, (per job) hardware configuration, and any hyperparameters you're not tuning.\n",
+    "\n",
+    "This assumes that you have already built your own GCMC Docker image and pushed it to your ECR by following the steps in mxnet_gcmc.ipynb."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "from sagemaker.mxnet.estimator import MXNet\n",
+    "\n",
+    "# Set target dgl-docker name\n",
+    "docker_name='sagemaker-dgl-gcmc'\n",
+    "\n",
+    "CODE_PATH = '../dgl_gcmc'\n",
+    "CODE_ENTRY = 'train.py'\n",
+    "#code_location = sess.upload_data(CODE_PATH, bucket=bucket, key_prefix=custom_code_upload_location)\n",
+    "\n",
+    "account = sess.boto_session.client('sts').get_caller_identity()['Account']\n",
+    "region = sess.boto_session.region_name\n",
+    "image = '{}.dkr.ecr.{}.amazonaws.com/{}:latest'.format(account, region, docker_name)\n",
+    "print(image)\n",
+    "\n",
+    "params = {}\n",
+    "params['data_name'] = 'ml-1m'\n",
+    "# set output to Amazon SageMaker ML output\n",
+    "params['save_dir'] = '/opt/ml/model/'\n",
+    "estimator = MXNet(entry_point=CODE_ENTRY,\n",
+    "                  source_dir=CODE_PATH,\n",
+    "                  role=role, \n",
+    "                  train_instance_count=1, \n",
+    "                  train_instance_type='ml.p3.2xlarge',\n",
+    "                  image_name=image,\n",
+    "                  hyperparameters=params,\n",
+    "                  sagemaker_session=sess)"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "After you define your estimator, specify the hyperparameters you want to tune and their possible values. You have three different types of hyperparameters.\n",
+    " * Categorical parameters need to take one value from a discrete set. You define this by passing the list of possible values to CategoricalParameter(list)\n",
+    " * Continuous parameters can take any real number value between the minimum and maximum value, defined by ContinuousParameter(min, max)\n",
+    " * Integer parameters can take any integer value between the minimum and maximum value, defined by IntegerParameter(min, max)\n",
+    " \n",
+    "If possible, it's almost always best to specify a value as the least restrictive type. For example, tuning a threshold as a continuous value between 0.01 and 0.2 is likely to yield a better result than tuning it as a categorical parameter with possible values of 0.01, 0.1, 0.15, or 0.2."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "hyperparameter_ranges = {'gcn_agg_accum': CategoricalParameter(['sum', 'stack']),\n",
+    "                         'train_lr': ContinuousParameter(0.001, 0.1),\n",
+    "                         'gen_r_num_basis_func': IntegerParameter(1, 3)}"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "Next, specify the objective metric to tune and its definition. This includes the regular expression (Regex) needed to extract that metric from the CloudWatch logs of the training job.\n",
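+    "\n",
+    "As a quick sanity check of the Regex, you can test it against the summary line that train.py prints at the end of training. The numbers below are illustrative:\n",
+    "\n",
+    "```python\n",
+    "import re\n",
+    "\n",
+    "# Shaped like the final line train.py prints; the values are made up\n",
+    "log_line = 'Best Iter Idx=1500, Best Valid RMSE=0.8423, Best Test RMSE=0.8512'\n",
+    "pattern = 'Best Iter Idx=[0-9\\.]+, Best Valid RMSE=[0-9\\.]+, Best Test RMSE=([0-9\\.]+)'\n",
+    "print(re.search(pattern, log_line).group(1))  # prints 0.8512\n",
+    "```"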
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "objective_metric_name = 'Test-RMSE'\n",
+    "metric_definitions = [{'Name': 'Test-RMSE',\n",
+    "                       'Regex': 'Best Iter Idx=[0-9\\.]+, Best Valid RMSE=[0-9\\.]+, Best Test RMSE=([0-9\\.]+)'}]\n"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "Now, create a HyperparameterTuner object, to which you pass:\n",
+    "\n",
+    " * The training estimator created above\n",
+    " * Your hyperparameter ranges\n",
+    " * Objective metric name and definition\n",
+    " * Number of training jobs to run in total and how many training jobs should be run simultaneously. More parallel jobs will finish tuning sooner, but may sacrifice accuracy. We recommend you set the parallel jobs value to less than 10% of the total number of training jobs (we'll set it higher just for this example to keep it short).\n",
+    " * Whether to maximize or minimize the objective metric. Here you specify 'Minimize', because the metric extracted from the logs is a test RMSE, and lower is better."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "tuner = HyperparameterTuner(estimator,\n",
+    "                            objective_metric_name,\n",
+    "                            hyperparameter_ranges,\n",
+    "                            metric_definitions,\n",
+    "                            objective_type='Minimize',\n",
+    "                            max_jobs=10,\n",
+    "                            max_parallel_jobs=2)"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "And finally, you can start the tuning job by calling .fit()."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "tuner.fit()"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "Run a quick check of the hyperparameter tuning job's status to make sure it started successfully and is InProgress."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "boto3.client('sagemaker').describe_hyper_parameter_tuning_job(\n",
+    "    HyperParameterTuningJobName=tuner.latest_tuning_job.job_name)['HyperParameterTuningJobStatus']"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "## Wrap-up\n",
+    "Now that you started the hyperparameter tuning job, it runs in the background and you can close this notebook. When it's finished, you can go to the console to analyze the results.\n",
+    "\n",
+    "For more details on Amazon SageMaker hyperparameter tuning, see the AWS documentation.\n",
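+    "\n",
+    "If you prefer to inspect the results programmatically, here is a short sketch using the SageMaker Python SDK's tuning analytics. It assumes the tuning job has completed at least one training job:\n",
+    "\n",
+    "```python\n",
+    "from sagemaker.analytics import HyperparameterTuningJobAnalytics\n",
+    "\n",
+    "analytics = HyperparameterTuningJobAnalytics(tuner.latest_tuning_job.job_name)\n",
+    "df = analytics.dataframe()  # one row per training job\n",
+    "# Lower is better here because the objective is an RMSE\n",
+    "print(df.sort_values('FinalObjectiveValue').head())\n",
+    "```"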
+ ] + } + ], + "metadata": { + "kernelspec": { + "display_name": "conda_dgl_py36_mxnet1.5", + "language": "python", + "name": "conda_dgl_py36_mxnet1.5" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.6.5" + } + }, + "nbformat": 4, + "nbformat_minor": 2 +} diff --git a/sagemaker-python-sdk/dgl_gcmc/train.py b/sagemaker-python-sdk/dgl_gcmc/train.py new file mode 100644 index 0000000000..441efa8144 --- /dev/null +++ b/sagemaker-python-sdk/dgl_gcmc/train.py @@ -0,0 +1,245 @@ +"""Training script""" +import os, time +import argparse +import logging +import random +import string +import json +import numpy as np +import mxnet as mx +from mxnet import gluon +from data import MovieLens +from model import GCMCLayer, BiDecoder +from utils import get_activation, parse_ctx, gluon_net_info, gluon_total_param_num, \ + params_clip_global_norm, MetricLogger +from mxnet.gluon import Block + +class Net(Block): + def __init__(self, args, **kwargs): + super(Net, self).__init__(**kwargs) + self._act = get_activation(args.model_activation) + with self.name_scope(): + self.encoder = GCMCLayer(args.rating_vals, + args.src_in_units, + args.dst_in_units, + args.gcn_agg_units, + args.gcn_out_units, + args.gcn_dropout, + args.gcn_agg_accum, + agg_act=self._act, + share_user_item_param=args.share_param) + self.decoder = BiDecoder(args.rating_vals, + in_units=args.gcn_out_units, + num_basis_functions=args.gen_r_num_basis_func) + + def forward(self, enc_graph, dec_graph, ufeat, ifeat): + user_out, movie_out = self.encoder( + enc_graph, + ufeat, + ifeat) + pred_ratings = self.decoder(dec_graph, user_out, movie_out) + return pred_ratings + +def evaluate(args, net, dataset, segment='valid'): + possible_rating_values = dataset.possible_rating_values + nd_possible_rating_values = mx.nd.array(possible_rating_values, ctx=args.ctx, dtype=np.float32) + + if segment == "valid": + rating_values = dataset.valid_truths + enc_graph = dataset.valid_enc_graph + dec_graph = dataset.valid_dec_graph + elif segment == "test": + rating_values = dataset.test_truths + enc_graph = dataset.test_enc_graph + dec_graph = dataset.test_dec_graph + else: + raise NotImplementedError + + # Evaluate RMSE + with mx.autograd.predict_mode(): + pred_ratings = net(enc_graph, dec_graph, + dataset.user_feature, dataset.movie_feature) + real_pred_ratings = (mx.nd.softmax(pred_ratings, axis=1) * + nd_possible_rating_values.reshape((1, -1))).sum(axis=1) + rmse = mx.nd.square(real_pred_ratings - rating_values).mean().asscalar() + rmse = np.sqrt(rmse) + return rmse + +def train(args): + print(args) + dataset = MovieLens(args.data_name, args.ctx, use_one_hot_fea=args.use_one_hot_fea, symm=args.gcn_agg_norm_symm, + test_ratio=args.data_test_ratio, valid_ratio=args.data_valid_ratio) + print("Loading data finished ...\n") + + args.src_in_units = dataset.user_feature_shape[1] + args.dst_in_units = dataset.movie_feature_shape[1] + args.rating_vals = dataset.possible_rating_values + + ### build the net + net = Net(args=args) + net.initialize(init=mx.init.Xavier(factor_type='in'), ctx=args.ctx) + net.hybridize() + nd_possible_rating_values = mx.nd.array(dataset.possible_rating_values, ctx=args.ctx, dtype=np.float32) + rating_loss_net = gluon.loss.SoftmaxCELoss() + rating_loss_net.hybridize() + trainer = gluon.Trainer(net.collect_params(), args.train_optimizer, 
{'learning_rate': args.train_lr}) + print("Loading network finished ...\n") + + ### perpare training data + train_gt_labels = dataset.train_labels + train_gt_ratings = dataset.train_truths + + ### prepare the logger + train_loss_logger = MetricLogger(['iter', 'loss', 'rmse'], ['%d', '%.4f', '%.4f'], + os.path.join(args.save_dir, 'train_loss%d.csv' % args.save_id)) + valid_loss_logger = MetricLogger(['iter', 'rmse'], ['%d', '%.4f'], + os.path.join(args.save_dir, 'valid_loss%d.csv' % args.save_id)) + test_loss_logger = MetricLogger(['iter', 'rmse'], ['%d', '%.4f'], + os.path.join(args.save_dir, 'test_loss%d.csv' % args.save_id)) + + ### declare the loss information + best_valid_rmse = np.inf + no_better_valid = 0 + best_iter = -1 + avg_gnorm = 0 + count_rmse = 0 + count_num = 0 + count_loss = 0 + + print("Start training ...") + dur = [] + for iter_idx in range(1, args.train_max_iter): + if iter_idx > 3: + t0 = time.time() + with mx.autograd.record(): + pred_ratings = net(dataset.train_enc_graph, dataset.train_dec_graph, + dataset.user_feature, dataset.movie_feature) + loss = rating_loss_net(pred_ratings, train_gt_labels).mean() + loss.backward() + + count_loss += loss.asscalar() + gnorm = params_clip_global_norm(net.collect_params(), args.train_grad_clip, args.ctx) + avg_gnorm += gnorm + trainer.step(1.0) + if iter_idx > 3: + dur.append(time.time() - t0) + + if iter_idx == 1: + print("Total #Param of net: %d" % (gluon_total_param_num(net))) + print(gluon_net_info(net, save_path=os.path.join(args.save_dir, 'net%d.txt' % args.save_id))) + + real_pred_ratings = (mx.nd.softmax(pred_ratings, axis=1) * + nd_possible_rating_values.reshape((1, -1))).sum(axis=1) + rmse = mx.nd.square(real_pred_ratings - train_gt_ratings).sum() + count_rmse += rmse.asscalar() + count_num += pred_ratings.shape[0] + + if iter_idx % args.train_log_interval == 0: + train_loss_logger.log(iter=iter_idx, + loss=count_loss/(iter_idx+1), rmse=count_rmse/count_num) + logging_str = "Iter={}, gnorm={:.3f}, loss={:.4f}, rmse={:.4f}, time={:.4f}".format( + iter_idx, avg_gnorm/args.train_log_interval, + count_loss/iter_idx, count_rmse/count_num, + np.average(dur)) + avg_gnorm = 0 + count_rmse = 0 + count_num = 0 + + if iter_idx % args.train_valid_interval == 0: + valid_rmse = evaluate(args=args, net=net, dataset=dataset, segment='valid') + valid_loss_logger.log(iter = iter_idx, rmse = valid_rmse) + logging_str += ',\tVal RMSE={:.4f}'.format(valid_rmse) + + if valid_rmse < best_valid_rmse: + best_valid_rmse = valid_rmse + no_better_valid = 0 + best_iter = iter_idx + net.save_parameters(filename=os.path.join(args.save_dir, 'best_valid_net{}.params'.format(args.save_id))) + test_rmse = evaluate(args=args, net=net, dataset=dataset, segment='test') + best_test_rmse = test_rmse + test_loss_logger.log(iter=iter_idx, rmse=test_rmse) + logging_str += ', Test RMSE={:.4f}'.format(test_rmse) + else: + no_better_valid += 1 + if no_better_valid > args.train_early_stopping_patience\ + and trainer.learning_rate <= args.train_min_lr: + logging.info("Early stopping threshold reached. 
Stop training.") + break + if no_better_valid > args.train_decay_patience: + new_lr = max(trainer.learning_rate * args.train_lr_decay_factor, args.train_min_lr) + if new_lr < trainer.learning_rate: + logging.info("\tChange the LR to %g" % new_lr) + trainer.set_learning_rate(new_lr) + no_better_valid = 0 + if iter_idx % args.train_log_interval == 0: + print(logging_str) + print('Best Iter Idx={}, Best Valid RMSE={:.4f}, Best Test RMSE={:.4f}'.format( + best_iter, best_valid_rmse, best_test_rmse)) + train_loss_logger.close() + valid_loss_logger.close() + test_loss_logger.close() + + +def config(): + parser = argparse.ArgumentParser(description='Run the baseline method.') + + parser.add_argument('--seed', default=123, type=int) + parser.add_argument('--ctx', dest='ctx', default='gpu0', type=str, + help='Running Context. E.g `--ctx gpu` or `--ctx gpu0,gpu1` or `--ctx cpu`') + parser.add_argument('--save_dir', type=str, help='The saving directory') + parser.add_argument('--save_id', type=int, help='The saving log id') + parser.add_argument('--silent', action='store_true') + + parser.add_argument('--data_name', default='ml-1m', type=str, + help='The dataset name: ml-100k, ml-1m, ml-10m') + parser.add_argument('--data_test_ratio', type=float, default=0.1) ## for ml-100k the test ration is 0.2 + parser.add_argument('--data_valid_ratio', type=float, default=0.1) + parser.add_argument('--use_one_hot_fea', action='store_true', default=False) + + #parser.add_argument('--model_remove_rating', type=bool, default=False) + parser.add_argument('--model_activation', type=str, default="leaky") + + parser.add_argument('--gcn_dropout', type=float, default=0.7) + parser.add_argument('--gcn_agg_norm_symm', type=bool, default=True) + parser.add_argument('--gcn_agg_units', type=int, default=500) + parser.add_argument('--gcn_agg_accum', type=str, default="sum") + parser.add_argument('--gcn_out_units', type=int, default=75) + + parser.add_argument('--gen_r_num_basis_func', type=int, default=2) + + # parser.add_argument('--train_rating_batch_size', type=int, default=10000) + parser.add_argument('--train_max_iter', type=int, default=2000) + parser.add_argument('--train_log_interval', type=int, default=1) + parser.add_argument('--train_valid_interval', type=int, default=1) + parser.add_argument('--train_optimizer', type=str, default="adam") + parser.add_argument('--train_grad_clip', type=float, default=1.0) + parser.add_argument('--train_lr', type=float, default=0.01) + parser.add_argument('--train_min_lr', type=float, default=0.001) + parser.add_argument('--train_lr_decay_factor', type=float, default=0.5) + parser.add_argument('--train_decay_patience', type=int, default=50) + parser.add_argument('--train_early_stopping_patience', type=int, default=100) + parser.add_argument('--share_param', default=False, action='store_true') + + args = parser.parse_args() + return args + +if __name__ == '__main__': + args = config() + + args.ctx = parse_ctx(args.ctx)[0] + print(args.ctx) + + ### configure save_dir to save all the info + if args.save_dir is None: + args.save_dir = args.data_name+"_" + ''.join(random.choices(string.ascii_uppercase + string.digits, k=2)) + if args.save_id is None: + args.save_id = np.random.randint(20) + args.save_dir = os.path.join(os.environ['SM_MODEL_DIR'], args.save_dir) + if not os.path.isdir(args.save_dir): + os.makedirs(args.save_dir) + + # PIN the seed + if args.seed != -1: + np.random.seed(args.seed) + mx.random.seed(args.seed, args.ctx) + train(args) diff --git 
a/sagemaker-python-sdk/dgl_gcmc/utils.py b/sagemaker-python-sdk/dgl_gcmc/utils.py new file mode 100644 index 0000000000..b56e37656d --- /dev/null +++ b/sagemaker-python-sdk/dgl_gcmc/utils.py @@ -0,0 +1,79 @@ +import ast +import os +import csv +import inspect +import logging +import re +import mxnet.ndarray as nd +from mxnet import gluon +from mxnet.gluon import nn +import mxnet as mx +import numpy as np +from collections import OrderedDict + +class MetricLogger(object): + def __init__(self, attr_names, parse_formats, save_path): + self._attr_format_dict = OrderedDict(zip(attr_names, parse_formats)) + self._file = open(save_path, 'w') + self._csv = csv.writer(self._file) + self._csv.writerow(attr_names) + self._file.flush() + + def log(self, **kwargs): + self._csv.writerow([parse_format % kwargs[attr_name] + for attr_name, parse_format in self._attr_format_dict.items()]) + self._file.flush() + + def close(self): + self._file.close() + +def parse_ctx(ctx_args): + ctx = re.findall('([a-z]+)(\d*)', ctx_args) + ctx = [(device, int(num)) if len(num) > 0 else (device, 0) for device, num in ctx] + ctx = [mx.Context(*ele) for ele in ctx] + return ctx + + +def gluon_total_param_num(net): + return sum([np.prod(v.shape) for v in net.collect_params().values()]) + + +def gluon_net_info(net, save_path=None): + info_str = 'Total Param Number: {}\n'.format(gluon_total_param_num(net)) +\ + 'Params:\n' + for k, v in net.collect_params().items(): + info_str += '\t{}: {}, {}\n'.format(k, v.shape, np.prod(v.shape)) + info_str += str(net) + if save_path is not None: + with open(save_path, 'w') as f: + f.write(info_str) + return info_str + + +def params_clip_global_norm(param_dict, clip, ctx): + grads = [p.grad(ctx) for p in param_dict.values()] + gnorm = gluon.utils.clip_global_norm(grads, clip) + return gnorm + +def get_activation(act): + """Get the activation based on the act string + + Parameters + ---------- + act: str or HybridBlock + + Returns + ------- + ret: HybridBlock + """ + if act is None: + return lambda x: x + if isinstance(act, str): + if act == 'leaky': + return nn.LeakyReLU(0.1) + elif act in ['relu', 'sigmoid', 'tanh', 'softrelu', 'softsign']: + return nn.Activation(act) + else: + raise NotImplementedError + else: + return act diff --git a/sagemaker-python-sdk/dgl_gcn/README.md b/sagemaker-python-sdk/dgl_gcn/README.md new file mode 100644 index 0000000000..ef61517611 --- /dev/null +++ b/sagemaker-python-sdk/dgl_gcn/README.md @@ -0,0 +1,36 @@ +# Setup Deep Graph Library with Amazon SageMaker to create graph convolutional network examples +The steps here show how to run graph convolutional network (GCN) with Amazon SageMaker. For more information about Deep Graph Library (DGL) and GCN, see the [DGL documentation](https://docs.dgl.ai). + +## Setup conda environment for DGL (MXNet backend) +You can install a conda environment for DGL with MXNet backend with a CPU-build. + +To create this, use the following steps: +``` +# Clone python3 environment +conda create --name DGL_py36_mxnet1.5 --clone python3 + +# Install MXNet and DGL (This is only CPU version) +source activate DGL_py36_mxnet1.5 +conda install -c anaconda scipy +conda install -c anaconda numpy +conda install -c anaconda numexpr +conda install -c anaconda blas=1.0=mkl mkl-service +conda install -c anaconda mkl_fft==1.0.1 mkl_random==1.0.1 +conda install -c anaconda numpy-base==1.16.0 scikit-learn mxnet=1.5.0 +conda install -c dglteam dgl +``` +You can select DGL_py36_mxnet1.5 conda environment now. 
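+
+As a quick sanity check that the environment works, you can build a tiny graph and run one round of message passing. This is only a minimal sketch; it assumes the DGL_py36_mxnet1.5 environment above is activated:
+```
+import dgl
+import dgl.function as fn
+import mxnet as mx
+
+# A toy 3-node graph with edges 0->1 and 1->2
+g = dgl.DGLGraph()
+g.add_nodes(3)
+g.add_edges([0, 1], [1, 2])
+g.ndata['h'] = mx.nd.ones((3, 4))
+
+# One round of message passing: sum incoming neighbor features
+g.update_all(fn.copy_u('h', 'm'), fn.sum('m', 'h'))
+print(g.ndata['h'])
+```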
+ +## Setup a conda environment for DGL (PyTorch backend) +You can install a conda environment for DGL with PyTorch backend with GPU-build. + +To create this, use the following steps: +``` +# Clone python3 environment +conda create --name DGL_py36_pytorch1.2 --clone python3 + +# Install PyTorch and DGL +conda install --name DGL_py36_pytorch1.2 pytorch=1.2 torchvision -c pytorch +conda install --name DGL_py36_pytorch1.2 -c dglteam dgl-cuda10.0 +``` +You can select DGL_py36_pytorch1.2 conda environment now. diff --git a/sagemaker-python-sdk/dgl_gcn/mxnet_gcn.ipynb b/sagemaker-python-sdk/dgl_gcn/mxnet_gcn.ipynb new file mode 100644 index 0000000000..bb1f155566 --- /dev/null +++ b/sagemaker-python-sdk/dgl_gcn/mxnet_gcn.ipynb @@ -0,0 +1,165 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Training Amazon SageMaker models by using the Deep Graph Library with MXNet backend\n", + "The **Amazon SageMaker Python SDK** makes it easy to train Deep Graph Library (DGL) models. In this example, you train a graph neural network by using the [DMLC DGL API](https://github.com/dmlc/dgl.git) and the [Cora dataset](https://relational.fit.cvut.cz/dataset/CORA). The Cora dataset describes a citation network. The Cora dataset consists of 2,708 scientific publications classified into one of seven classes. The citation network consists of 5,429 links. The task is to train a node classification model using Cora dataset. \n", + "\n", + "For more information about Graph Neural Network and this example, see https://docs.dgl.ai/en/latest/tutorials/models/1_gnn/1_gcn.html\n", + "\n", + "### Prepare for training\n", + "To get started, install necessary packages." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "!conda install -y boto3\n", + "!conda install -c anaconda -y botocore" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Setup\n", + "Define a few variables that will be needed later in the example." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "import sagemaker\n", + "from sagemaker import get_execution_role\n", + "from sagemaker.session import Session\n", + "\n", + "# Setup session\n", + "sess = sagemaker.Session()\n", + "\n", + "# S3 bucket for saving code and model artifacts.\n", + "# Feel free to specify a different bucket here if you wish.\n", + "bucket = sess.default_bucket()\n", + "\n", + "# Location to put your custom code.\n", + "custom_code_upload_location = 'customcode'\n", + "\n", + "# IAM execution role that gives Amazon SageMaker access to resources in your AWS account.\n", + "# You can use the Amazon SageMaker Python SDK to get the role from our notebook environment. \n", + "role = get_execution_role()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### The training script\n", + "The mxnet_gcn.py script provides all the code you need for training an Amazon SageMaker model. " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "!cat mxnet_gcn.py" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### SageMaker's estimator class\n", + "The Amazon SageMaker Estimator allows us to run a single machine in Amazon SageMaker, using CPU or GPU-based instances.\n", + "\n", + "When you create the estimator, pass in the file name of our training script and the name of our IAM execution role. 
You can also provide a few other parameters. train_instance_count and train_instance_type determine the number and type of Amazon SageMaker instances that will be used for the training job. The hyperparameters parameter is a dictionary of values that is passed to your training script as parameters so that you can use argparse to parse them. You can see how to access these values in the mxnet_gcn.py script above.\n",
+    "\n",
+    "Here you can use the official Docker image for this example. For more information, see https://docs.aws.amazon.com/dlami/latest/devguide/deep-learning-containers-images.html. You should get the latest mxnet-1.6.0-gpu-py3 or mxnet-1.6.0-cpu-py3 image from the official Amazon Elastic Container Registry (Amazon ECR) and push it into your own ECR.\n",
+    "\n",
+    "For this example, choose a single ml.p3.2xlarge instance. You can also use a CPU instance such as ml.c4.2xlarge for the CPU image."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "from sagemaker.mxnet.estimator import MXNet\n",
+    "\n",
+    "CODE_PATH = 'mxnet_gcn.py'\n",
+    "\n",
+    "account = sess.boto_session.client('sts').get_caller_identity()['Account']\n",
+    "region = sess.boto_session.region_name\n",
+    "docker_name = 'beta-mxnet-training' # change this for your own ECR image name\n",
+    "docker_tag = '1.6.0-py3-gpu-build' # change this for your own ECR image tag\n",
+    "image = '{}.dkr.ecr.{}.amazonaws.com/{}:{}'.format(account, region, docker_name, docker_tag)\n",
+    "print(image)\n",
+    "\n",
+    "params = {}\n",
+    "params['dataset'] = 'cora'\n",
+    "estimator = MXNet(entry_point=CODE_PATH,\n",
+    "                  role=role, \n",
+    "                  train_instance_count=1, \n",
+    "                  train_instance_type='ml.p3.2xlarge', # 'ml.c4.2xlarge'\n",
+    "                  image_name=image,\n",
+    "                  hyperparameters=params,\n",
+    "                  sagemaker_session=sess)"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "### Running the Training Job\n",
+    "After you've constructed the Estimator object, fit it using Amazon SageMaker. The dataset will be automatically downloaded.\n",
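+    "\n",
+    "If you want to smoke-test the script before launching a SageMaker job, you can mimic the container contract locally. This is a rough sketch: it assumes DGL and MXNet are installed in this notebook's kernel, and SM_NUM_GPUS and SM_MODEL_DIR are environment variables that SageMaker normally sets for you inside the training container:\n",
+    "\n",
+    "```python\n",
+    "import os\n",
+    "import subprocess\n",
+    "\n",
+    "env = dict(os.environ, SM_NUM_GPUS='0', SM_MODEL_DIR='/tmp')  # '0' forces CPU\n",
+    "subprocess.run(['python', 'mxnet_gcn.py', '--dataset', 'cora', '--n-epochs', '5'],\n",
+    "               env=env, check=True)\n",
+    "```"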
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "estimator.fit()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Output\n", + "You can get the model training output from the Amazon SageMaker console by searching for the training task and looking for the address of 'S3 model artifact'" + ] + } + ], + "metadata": { + "kernelspec": { + "display_name": "conda_mxnet_p36", + "language": "python", + "name": "conda_mxnet_p36" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.6.5" + } + }, + "nbformat": 4, + "nbformat_minor": 2 +} diff --git a/sagemaker-python-sdk/dgl_gcn/mxnet_gcn.py b/sagemaker-python-sdk/dgl_gcn/mxnet_gcn.py new file mode 100644 index 0000000000..b468484f05 --- /dev/null +++ b/sagemaker-python-sdk/dgl_gcn/mxnet_gcn.py @@ -0,0 +1,188 @@ +#!/usr/bin/env python +# coding: utf-8 + +"""GCN using DGL nn package +References: +- Semi-Supervised Classification with Graph Convolutional Networks +- Paper: https://arxiv.org/abs/1609.02907 +- Code: https://github.com/tkipf/gcn +""" +import mxnet as mx +from mxnet import gluon +import os +import argparse +import dgl +from dgl.nn.mxnet import GraphConv + +import time +import json +import numpy as np +from mxnet import gluon + +from dgl import DGLGraph +from dgl.data import register_data_args, load_data + +import collections +class GCN(gluon.Block): + def __init__(self, + g, + in_feats, + n_hidden, + n_classes, + n_layers, + activation, + dropout): + super(GCN, self).__init__() + self.g = g + self.layers = gluon.nn.Sequential() + # input layer + self.layers.add(GraphConv(in_feats, n_hidden, activation=activation)) + # hidden layers + for i in range(n_layers - 1): + self.layers.add(GraphConv(n_hidden, n_hidden, activation=activation)) + # output layer + self.layers.add(GraphConv(n_hidden, n_classes)) + self.dropout = gluon.nn.Dropout(rate=dropout) + + def forward(self, features): + h = features + for i, layer in enumerate(self.layers): + if i != 0: + h = self.dropout(h) + h = layer(self.g, h) + return h + +def evaluate(model, features, labels, mask): + pred = model(features).argmax(axis=1) + accuracy = ((pred == labels) * mask).sum() / mask.sum().asscalar() + return accuracy.asscalar() + +def main(args): + # load and preprocess dataset + data = load_data(args) + features = mx.nd.array(data.features) + labels = mx.nd.array(data.labels) + train_mask = mx.nd.array(data.train_mask) + val_mask = mx.nd.array(data.val_mask) + test_mask = mx.nd.array(data.test_mask) + in_feats = features.shape[1] + n_classes = data.num_labels + n_edges = data.graph.number_of_edges() + print("""----Data statistics------' + #Edges %d + #Classes %d + #Train samples %d + #Val samples %d + #Test samples %d""" % + (n_edges, n_classes, + train_mask.sum().asscalar(), + val_mask.sum().asscalar(), + test_mask.sum().asscalar())) + + if args.gpu < 0: + cuda = False + ctx = mx.cpu(0) + else: + cuda = True + ctx = mx.gpu(args.gpu) + + features = features.as_in_context(ctx) + labels = labels.as_in_context(ctx) + train_mask = train_mask.as_in_context(ctx) + val_mask = val_mask.as_in_context(ctx) + test_mask = test_mask.as_in_context(ctx) + + # create GCN model + g = data.graph + if args.self_loop: + g.remove_edges_from(g.selfloop_edges()) + g.add_edges_from(zip(g.nodes(), g.nodes())) + 
g = DGLGraph(g)
+    # normalization
+    degs = g.in_degrees().astype('float32')
+    norm = mx.nd.power(degs, -0.5)
+    if cuda:
+        norm = norm.as_in_context(ctx)
+    g.ndata['norm'] = mx.nd.expand_dims(norm, 1)
+
+    model = GCN(g,
+                in_feats,
+                args.n_hidden,
+                n_classes,
+                args.n_layers,
+                mx.nd.relu,
+                args.dropout)
+    model.initialize(ctx=ctx)
+    n_train_samples = train_mask.sum().asscalar()
+    loss_fcn = gluon.loss.SoftmaxCELoss()
+
+    # use optimizer
+    print(model.collect_params())
+    trainer = gluon.Trainer(model.collect_params(), 'adam',
+                            {'learning_rate': args.lr, 'wd': args.weight_decay})
+
+    # initialize graph
+    dur = []
+    for epoch in range(args.n_epochs):
+        if epoch >= 3:
+            t0 = time.time()
+        # forward
+        with mx.autograd.record():
+            pred = model(features)
+            loss = loss_fcn(pred, labels, mx.nd.expand_dims(train_mask, 1))
+            loss = loss.sum() / n_train_samples
+
+        loss.backward()
+        trainer.step(batch_size=1)
+
+        if epoch >= 3:
+            loss.asscalar()
+            dur.append(time.time() - t0)
+            acc = evaluate(model, features, labels, val_mask)
+            print("Epoch {:05d} | Time(s) {:.4f} | Loss {:.4f} | Accuracy {:.4f} | "
+                  "ETputs(KTEPS) {:.2f}".format(
+                      epoch, np.mean(dur), loss.asscalar(), acc, n_edges / np.mean(dur) / 1000))
+
+    # test set accuracy
+    acc = evaluate(model, features, labels, test_mask)
+    print("Test accuracy {:.2%}".format(acc))
+
+    model.save_parameters(args.save_path)
+
+def parse_args():
+    parser = argparse.ArgumentParser(description='GCN')
+    register_data_args(parser)
+    parser.add_argument("--dropout", type=float, default=0.5,
+                        help="dropout probability")
+    parser.add_argument("--gpu", type=int, default=-1,
+                        help="gpu")
+    parser.add_argument("--lr", type=float, default=3e-2,
+                        help="learning rate")
+    parser.add_argument("--n-epochs", type=int, default=200,
+                        help="number of training epochs")
+    parser.add_argument("--n-hidden", type=int, default=16,
+                        help="number of hidden gcn units")
+    parser.add_argument("--n-layers", type=int, default=1,
+                        help="number of hidden gcn layers")
+    parser.add_argument("--weight-decay", type=float, default=5e-4,
+                        help="Weight for L2 loss")
+    parser.add_argument("--self-loop", action='store_true',
+                        help="graph self-loop (default=False)")
+    parser.add_argument("--save-path", type=str, default='./model/gcn.params',
+                        help="path to save model")
+    parser.set_defaults(self_loop=False)
+
+    return parser.parse_args()
+
+if __name__ == '__main__':
+    args = parse_args()
+    num_gpus = int(os.environ['SM_NUM_GPUS'])
+    if num_gpus == 0:
+        args.gpu = -1
+    else:
+        args.gpu = 0
+
+    path = str(os.environ['SM_MODEL_DIR'])
+    args.save_path = os.path.join(path, 'gcn.params')
+    print(args)
+    main(args)
diff --git a/sagemaker-python-sdk/dgl_gcn/mxnet_gcn_hypertune.ipynb b/sagemaker-python-sdk/dgl_gcn/mxnet_gcn_hypertune.ipynb
new file mode 100644
index 0000000000..741dec1152
--- /dev/null
+++ b/sagemaker-python-sdk/dgl_gcn/mxnet_gcn_hypertune.ipynb
@@ -0,0 +1,279 @@
+{
+ "cells": [
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "# Hyperparameter tuning with Amazon SageMaker and Deep Graph Library with MXNet backend\n",
+    "_**Creating a Hyperparameter tuning job for a DGL network**_\n",
+    "___\n",
+    "___\n",
+    "\n",
+    "\n",
+    "## Contents\n",
+    "1. [Background](#Background) \n",
+    "2. [Setup](#Setup) \n",
+    "3. [Code](#Code) \n",
+    "4. [Tune](#Tune) \n",
+    "5. [Wrap-up](#Wrap-up) \n",
+    "\n",
+    "## Background\n",
+    "This example notebook shows how to create a graph neural network model and train it on the [Cora dataset](https://relational.fit.cvut.cz/dataset/CORA) by using DGL with an MXNet backend. 
It uses the Amazon SageMaker hyperparameter tuning to start multiple training jobs with different hyperparameter combinations. This helps you find the set with best model performance. This is an important step in the machine learning process as hyperparameter settings can have a large effect on model accuracy. In this example, you use the [Amazon SageMaker Python SDK](https://github.com/aws/sagemaker-python-sdk) to create a hyperparameter tuning job for an Amazon SageMaker estimator." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Setup\n", + "This notebook was created and tested on an ml.p3.2xlarge notebook instance.\n", + "\n", + "Prerequisites\n", + " * You can successfully run the mxnet_gcn example (see mxnet_gcn.ipynb).\n", + " * You have an S3 bucket and prefix that you want to use for training and model data. This should be within the same Region as the notebook instance, training, and hosting.\n", + " * You have the IAM role ARN used to give training and hosting access to your data. See the documentation for more details on creating these. If a role not associated with the current notebook instance, or more than one role, is required for training or hosting, replace sagemaker.get_execution_role() with the appropriate full IAM role ARN strings.\n" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "import sagemaker\n", + "\n", + "from sagemaker import get_execution_role\n", + "from sagemaker.session import Session\n", + "\n", + "# Setup session\n", + "sess = sagemaker.Session()\n", + "\n", + "# S3 bucket for saving code and model artifacts.\n", + "# Feel free to specify a different bucket here if you wish.\n", + "bucket = sess.default_bucket()\n", + "\n", + "# Location to put your custom code.\n", + "custom_code_upload_location = 'customcode'\n", + "\n", + "# IAM execution role that gives Amazon SageMaker access to resources in your AWS account.\n", + "# You can use the Amazon SageMaker Python SDK to get the role from the notebook environment. \n", + "role = sagemaker.get_execution_role()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Now we'll import the Python libraries we'll need." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "import boto3\n", + "from sagemaker.tuner import IntegerParameter, CategoricalParameter, ContinuousParameter, HyperparameterTuner" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Code\n", + "To use Amazon SageMaker to run Docker containers, you need to provide a Python script for the container to run. In this example, mxnet_gcn.py provides all the code you need for training an Amazon SageMaker model." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "!cat mxnet_gcn.py" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "After you specify and test the training script to ensure it works, start the tuning job. Testing can be done in either local mode or by using Amazon SageMaker training. " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Tune\n", + "Similar to training a single training job in Amazon SageMaker, you define the training estimator passing in the code scripts, IAM role, (per job) hardware configuration, and any hyperparameters you're not tuning." 
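+    ,"\n",
+    "The hyperparameters you fix here are combined with the values the tuner samples for each job, so every training job sees one merged set. A sketch of what one job might receive (the sampled values are illustrative):\n",
+    "\n",
+    "```python\n",
+    "static_params = {'dataset': 'cora'}          # passed to every job via the estimator\n",
+    "sampled = {'lr': 0.004, 'n-epochs': 150}     # one combination drawn by the tuner\n",
+    "job_hyperparameters = {**static_params, **sampled}\n",
+    "print(job_hyperparameters)\n",
+    "```"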
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "from sagemaker.mxnet.estimator import MXNet\n",
+    "\n",
+    "CODE_PATH = 'mxnet_gcn.py'\n",
+    "\n",
+    "account = sess.boto_session.client('sts').get_caller_identity()['Account']\n",
+    "region = sess.boto_session.region_name\n",
+    "docker_name = 'beta-mxnet-training' # change this for your own ECR image name\n",
+    "docker_tag = '1.6.0-py3-gpu-build' # change this for your own ECR image tag\n",
+    "image = '{}.dkr.ecr.{}.amazonaws.com/{}:{}'.format(account, region, docker_name, docker_tag)\n",
+    "print(image)\n",
+    "\n",
+    "params = {}\n",
+    "params['dataset'] = 'cora'\n",
+    "estimator = MXNet(entry_point=CODE_PATH,\n",
+    "                  role=role, \n",
+    "                  train_instance_count=1, \n",
+    "                  train_instance_type='ml.p3.2xlarge',\n",
+    "                  image_name=image,\n",
+    "                  hyperparameters=params,\n",
+    "                  sagemaker_session=sess)"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "After you define your estimator, specify the hyperparameters you want to tune and their possible values. You have three different types of hyperparameters.\n",
+    " * Categorical parameters need to take one value from a discrete set. Define this by passing the list of possible values to CategoricalParameter(list)\n",
+    " * Continuous parameters can take any real number value between the minimum and maximum value, defined by ContinuousParameter(min, max)\n",
+    " * Integer parameters can take any integer value between the minimum and maximum value, defined by IntegerParameter(min, max)\n",
+    " \n",
+    "If possible, it's almost always best to specify a value as the least restrictive type. For example, tuning a threshold as a continuous value between 0.01 and 0.2 is likely to yield a better result than tuning it as a categorical parameter with possible values of 0.01, 0.1, 0.15, or 0.2."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "hyperparameter_ranges = {'lr': ContinuousParameter(0.001, 0.01),\n",
+    "                         'n-epochs': IntegerParameter(100, 200)}"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "Next, specify the objective metric that you want to tune and its definition. This includes the regular expression needed to extract that metric from the Amazon CloudWatch logs of the training job."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "objective_metric_name = 'Validation-accuracy'\n",
+    "metric_definitions = [{'Name': 'Validation-accuracy',\n",
+    "                       'Regex': 'Test accuracy ([0-9\\.]+)%'}]"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "Now, create a HyperparameterTuner object, to which you pass:\n",
+    "\n",
+    " * The training estimator you created above\n",
+    " * The hyperparameter ranges\n",
+    " * Objective metric name and definition\n",
+    " * Number of training jobs to run in total and how many training jobs should be run simultaneously. More parallel jobs will finish tuning sooner, but may sacrifice accuracy. We recommend that you set the parallel jobs value to less than 10 percent of the total number of training jobs. It's set higher in this example to keep it short.\n",
+    " * Whether you should maximize or minimize the objective metric. 
You haven't specified here since it defaults to 'Maximize', which is what you want for validation accuracy" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "tuner = HyperparameterTuner(estimator,\n", + " objective_metric_name,\n", + " hyperparameter_ranges,\n", + " metric_definitions,\n", + " max_jobs=6,\n", + " max_parallel_jobs=2)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "And finally, you can start the tuning job by calling .fit()." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "tuner.fit()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Run a quick check of the hyperparameter tuning jobs status to make sure it started successfully and is InProgress." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "boto3.client('sagemaker').describe_hyper_parameter_tuning_job(\n", + " HyperParameterTuningJobName=tuner.latest_tuning_job.job_name)['HyperParameterTuningJobStatus']" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Wrap-up\n", + "Now that we've started the hyperparameter tuning job, it will run in the background. You can close this notebook. When it's finished, you can go to console to analyze the result.\n", + "\n", + "For more information about Amazon SageMaker's Hyperparameter Tuning, see the AWS documentation." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [] + } + ], + "metadata": { + "kernelspec": { + "display_name": "conda_mxnet_p36", + "language": "python", + "name": "conda_mxnet_p36" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.6.5" + } + }, + "nbformat": 4, + "nbformat_minor": 2 +} diff --git a/sagemaker-python-sdk/dgl_gcn/pytorch_gcn.ipynb b/sagemaker-python-sdk/dgl_gcn/pytorch_gcn.ipynb new file mode 100644 index 0000000000..5da203e2e7 --- /dev/null +++ b/sagemaker-python-sdk/dgl_gcn/pytorch_gcn.ipynb @@ -0,0 +1,150 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Training Amazon SageMaker models by using the Deep Graph Library with PyTorch backend\n", + "The **Amazon SageMaker Python SDK** makes it easy to train Deep Graph Library (DGL) models. In this example, you train a simple graph neural network using the [DMLC DGL API](https://github.com/dmlc/dgl.git) and the [Cora dataset](https://relational.fit.cvut.cz/dataset/CORA). The Cora dataset describes a citation network. The Cora dataset consists of 2,708 scientific publications classified into one of seven classes. The citation network consists of 5,429 links. The task is to train a node classification model using Cora dataset. " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Setup\n", + "Define a few variables that are needed later in the example." 
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "import sagemaker\n", + "from sagemaker import get_execution_role\n", + "from sagemaker.session import Session\n", + "\n", + "# Setup session\n", + "sess = sagemaker.Session()\n", + "\n", + "# S3 bucket for saving code and model artifacts.\n", + "# Feel free to specify a different bucket here.\n", + "bucket = sess.default_bucket()\n", + "\n", + "# Location to put your custom code.\n", + "custom_code_upload_location = 'customcode'\n", + "\n", + "# IAM execution role that gives Amazon SageMaker access to resources in your AWS account.\n", + "# You can use the Amazon SageMaker Python SDK to get the role from the notebook environment. \n", + "role = get_execution_role()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### The training script\n", + "The pytorch_gcn.py script provides all the code you need for training an Amazon SageMaker model. " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "!cat pytorch_gcn.py" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### SageMaker's estimator class\n", + "The Amazon SageMaker Estimator allows you to run a single machine in Amazon SageMaker, using CPU or GPU-based instances.\n", + "\n", + "When you create the estimator, pass in the filename of the training script and the name of the IAM execution role. You can also provide a few other parameters. train_instance_count and train_instance_type determine the number and type of Amazon SageMaker instances that are used for the training job. The hyperparameters parameter is a dictionary of values that is passed to your training script as parameters so that you can use argparse to parse them. You can see how to access these values in the pytorch_gcn.py script above.\n", + "\n", + "Here, you can use the official Docker image for this example. For more information, see https://docs.aws.amazon.com/dlami/latest/devguide/deep-learning-containers-images.html. You should get the latest pytorch-1.3.1-gpu-py3 or pytorch-1.3.1-cpu-py3 image from the official Amazon Elastic Container Registry (Amazon ECR) and push it into your own ECR.\n", + "\n", + "For this example, choose one ml.p3.2xlarge instance. You can also use a CPU instance such as ml.c4.2xlarge for the CPU image." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "from sagemaker.pytorch import PyTorch\n", + "\n", + "CODE_PATH = 'pytorch_gcn.py'\n", + "account = sess.boto_session.client('sts').get_caller_identity()['Account']\n", + "region = sess.boto_session.region_name\n", + "\n", + "docker_name = 'beta-pytorch-training' # change this for your own ECR image name\n", + "docker_tag = '1.3.1-py3-gpu-with-horovod-build' # change this for your own ECR image tag\n", + "image = '{}.dkr.ecr.{}.amazonaws.com/{}:{}'.format(account, region, docker_name, docker_tag)\n", + "print(image)\n", + "\n", + "params = {}\n", + "params['dataset'] = 'cora'\n", + "estimator = PyTorch(entry_point=CODE_PATH,\n", + " role=role, \n", + " train_instance_count=1, \n", + " train_instance_type='ml.p3.2xlarge', # 'ml.c4.2xlarge '\n", + " image_name=image,\n", + " hyperparameters=params,\n", + " sagemaker_session=sess)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Running the Training Job\n", + "After you construct the Estimator object, fit it by using Amazon SageMaker. The dataset is automatically downloaded." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "estimator.fit()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Output\n", + "You can get the model training output from the Amazon SageMaker console by searching for the training task named pytorch-gcn and looking for the address of 'S3 model artifact'." + ] + } + ], + "metadata": { + "kernelspec": { + "display_name": "conda_dgl_py36_mxnet1.5", + "language": "python", + "name": "conda_dgl_py36_mxnet1.5" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.6.5" + } + }, + "nbformat": 4, + "nbformat_minor": 2 +} diff --git a/sagemaker-python-sdk/dgl_gcn/pytorch_gcn.py b/sagemaker-python-sdk/dgl_gcn/pytorch_gcn.py new file mode 100644 index 0000000000..dd8d9784f4 --- /dev/null +++ b/sagemaker-python-sdk/dgl_gcn/pytorch_gcn.py @@ -0,0 +1,191 @@ +import torch +import torch.nn as nn +from dgl.nn.pytorch import GraphConv + +import os +import time +import json +import argparse +import numpy as np +import torch.nn.functional as F +from dgl import DGLGraph +from dgl.data import register_data_args, load_data + +# define the GCN model +class GCN(nn.Module): + def __init__(self, + g, + in_feats, + n_hidden, + n_classes, + n_layers, + activation, + dropout): + super(GCN, self).__init__() + self.g = g + self.layers = nn.ModuleList() + # input layer + self.layers.append(GraphConv(in_feats, n_hidden, activation=activation)) + # hidden layers + for i in range(n_layers - 1): + self.layers.append(GraphConv(n_hidden, n_hidden, activation=activation)) + # output layer + self.layers.append(GraphConv(n_hidden, n_classes)) + self.dropout = nn.Dropout(p=dropout) + + def forward(self, features): + h = features + for i, layer in enumerate(self.layers): + if i != 0: + h = self.dropout(h) + h = layer(self.g, h) + return h + +def evaluate(model, features, labels, mask): + model.eval() + with torch.no_grad(): + logits = model(features) + logits = logits[mask] + labels = labels[mask] + _, indices = torch.max(logits, dim=1) + correct = torch.sum(indices == labels) + return correct.item() * 1.0 / len(labels) + +def main(args): + # load and preprocess dataset + data = load_data(args) + features = torch.FloatTensor(data.features) + labels = torch.LongTensor(data.labels) + train_mask = torch.ByteTensor(data.train_mask) + val_mask = torch.ByteTensor(data.val_mask) + test_mask = torch.ByteTensor(data.test_mask) + in_feats = features.shape[1] + n_classes = data.num_labels + n_edges = data.graph.number_of_edges() + print("""----Data statistics------' + #Edges %d + #Classes %d + #Train samples %d + #Val samples %d + #Test samples %d""" % + (n_edges, n_classes, + train_mask.sum().item(), + val_mask.sum().item(), + test_mask.sum().item())) + + if args.gpu < 0: + cuda = False + else: + cuda = True + torch.cuda.set_device(args.gpu) + features = features.cuda() + labels = labels.cuda() + train_mask = train_mask.cuda() + val_mask = val_mask.cuda() + test_mask = test_mask.cuda() + + # graph preprocess and calculate normalization factor + g = data.graph + # add self loop + if args.self_loop: + g.remove_edges_from(g.selfloop_edges()) + g.add_edges_from(zip(g.nodes(), g.nodes())) + g = DGLGraph(g) + n_edges = g.number_of_edges() + # normalization + degs = 
g.in_degrees().float() + norm = torch.pow(degs, -0.5) + norm[torch.isinf(norm)] = 0 + if cuda: + norm = norm.cuda() + g.ndata['norm'] = norm.unsqueeze(1) + + # create GCN model + model = GCN(g, + in_feats, + args.n_hidden, + n_classes, + args.n_layers, + F.relu, + args.dropout) + + if cuda: + model.cuda() + loss_fcn = torch.nn.CrossEntropyLoss() + + # use optimizer + optimizer = torch.optim.Adam(model.parameters(), + lr=args.lr, + weight_decay=args.weight_decay) + + # initialize graph + dur = [] + for epoch in range(args.n_epochs): + model.train() + if epoch >= 3: + t0 = time.time() + # forward + logits = model(features) + loss = loss_fcn(logits[train_mask], labels[train_mask]) + + optimizer.zero_grad() + loss.backward() + optimizer.step() + + if epoch >= 3: + dur.append(time.time() - t0) + + acc = evaluate(model, features, labels, val_mask) + print("Epoch {:05d} | Time(s) {:.4f} | Loss {:.4f} | Accuracy {:.4f} | " + "ETputs(KTEPS) {:.2f}". format(epoch, np.mean(dur), loss.item(), + acc, n_edges / np.mean(dur) / 1000)) + + print() + acc = evaluate(model, features, labels, test_mask) + print("Test Accuracy {:.2%}".format(acc)) + + torch.save(model.state_dict(), args.save_path) + +import collections +import warnings +warnings.filterwarnings("ignore") + +def parse_args(): + parser = argparse.ArgumentParser(description='GCN') + register_data_args(parser) + parser.add_argument("--dropout", type=float, default=0.5, + help="dropout probability") + parser.add_argument("--gpu", type=int, default=-1, + help="gpu") + parser.add_argument("--lr", type=float, default=3e-2, + help="learning rate") + parser.add_argument("--n-epochs", type=int, default=200, + help="number of training epochs") + parser.add_argument("--n-hidden", type=int, default=16, + help="number of hidden gcn units") + parser.add_argument("--n-layers", type=int, default=1, + help="number of hidden gcn layers") + parser.add_argument("--weight-decay", type=float, default=5e-4, + help="Weight for L2 loss") + parser.add_argument("--self-loop", action='store_true', + help="graph self-loop (default=False)") + parser.add_argument("--save-path", type=str, default='./model/gcn.pt', + help="path to save model") + parser.set_defaults(self_loop=False) + + return parser.parse_args() + +if __name__ == '__main__': + args = parse_args() + + num_gpus = int(os.environ['SM_NUM_GPUS']) + if num_gpus == 0: + args.gpu = -1 + else: + args.gpu = 0 + + path = str(os.environ['SM_MODEL_DIR']) + args.save_path = os.path.join(path, 'gcn.pt') + + print(args) + main(args) diff --git a/sagemaker-python-sdk/dgl_gcn/pytorch_gcn_hypertune.ipynb b/sagemaker-python-sdk/dgl_gcn/pytorch_gcn_hypertune.ipynb new file mode 100644 index 0000000000..21f6623335 --- /dev/null +++ b/sagemaker-python-sdk/dgl_gcn/pytorch_gcn_hypertune.ipynb @@ -0,0 +1,273 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Hyperparameter tuning with Amazon SageMaker and Deep Graph Library with PyTorch backend\n", + "_**Creating a Hyperparameter Tuning Job for a Deep Graph Library (DGL) Network**_\n", + "___\n", + "___\n", + "\n", + "\n", + "## Contents\n", + "1. [Background](#Background) \n", + "2. [Setup](#Setup) \n", + "3. [Code](#Code) \n", + "4. [Tune](#Tune) \n", + "5. [Wrap-up](#Wrap-up) \n", + "\n", + "## Background\n", + "This example notebook focuses on how to create a graph neural network model to train the [Cora dataset](https://relational.fit.cvut.cz/dataset/CORA) using DGL with PyTorch backend. 
It leverages SageMaker's hyperparameter tuning to kick off multiple training jobs with different hyperparameter combinations, to find the set with the best model performance. This is an important step in the machine learning process as hyperparameter settings can have a large impact on model accuracy. In this example, you use the [Amazon SageMaker Python SDK](https://github.com/aws/sagemaker-python-sdk) to create a hyperparameter tuning job for an Amazon SageMaker estimator." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Setup\n", + "This notebook was created and tested on an ml.p3.2xlarge notebook instance.\n", + "\n", + "Prerequisites\n", + " * You can successfully run the pytorch_gcn example (see pytorch_gcn.ipynb).\n", + " * You have an S3 bucket and prefix to use for training and model data. This should be within the same Region as the notebook instance, training, and hosting.\n", + " * You have the IAM role ARN used to give training and hosting access to your data. See the documentation for more details on creating these. If a role not associated with the current notebook instance, or more than one role, is required for training and/or hosting, replace sagemaker.get_execution_role() with the appropriate full IAM role ARN string(s).\n" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "import sagemaker\n", + "\n", + "from sagemaker import get_execution_role\n", + "from sagemaker.session import Session\n", + "\n", + "# Setup session\n", + "sess = sagemaker.Session()\n", + "\n", + "# S3 bucket for saving code and model artifacts.\n", + "# Feel free to specify a different bucket here if you wish.\n", + "bucket = sess.default_bucket()\n", + "\n", + "# Location to put your custom code.\n", + "custom_code_upload_location = 'customcode'\n", + "\n", + "# IAM execution role that gives Amazon SageMaker access to resources in your AWS account.\n", + "# You can use the Amazon SageMaker Python SDK to get the role from the notebook environment. \n", + "role = sagemaker.get_execution_role()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Now import the Python libraries." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "import boto3\n", + "from sagemaker.tuner import IntegerParameter, CategoricalParameter, ContinuousParameter, HyperparameterTuner" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Code\n", + "To use Amazon SageMaker to run Docker containers, provide a Python script for the container to run. In this example, pytorch_gcn.py provides all the code for training an Amazon SageMaker model." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "!cat pytorch_gcn.py" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "After you specify and test the training script to ensure it works, you can start the tuning job. Testing can be done in either local mode or using Amazon SageMaker training." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Tune\n", + "Similar to training a single training job in Amazon SageMaker, define the training estimator, passing in the code scripts, IAM role, (per job) hardware configuration, and any hyperparameters you are not tuning."
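One quick check before wiring everything up: the tuner extracts the objective metric from the training logs with a regular expression (specified below), and it is cheap to validate the pattern locally first. A minimal sketch, assuming the script's final log line looks like the `Test Accuracy` print in pytorch_gcn.py:

```python
import re

# Hypothetical sample line, mimicking the final print in pytorch_gcn.py.
sample_line = 'Test Accuracy 81.20%'
pattern = r'Test Accuracy ([0-9\.]+)%'  # same regex passed to the tuner below

match = re.search(pattern, sample_line)
assert match is not None, 'the tuner would see no metric for this training job'
print(float(match.group(1)))  # the value the tuner would record
```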
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "from sagemaker.pytorch import PyTorch\n", + "\n", + "CODE_PATH = 'pytorch_gcn.py'\n", + "\n", + "account = sess.boto_session.client('sts').get_caller_identity()['Account']\n", + "region = sess.boto_session.region_name\n", + "\n", + "docker_name = 'beta-pytorch-training' # change this for your own ECR image name\n", + "docker_tag = '1.3.1-py3-gpu-with-horovod-build' # change this for your own ECR image tag\n", + "image = '{}.dkr.ecr.{}.amazonaws.com/{}:{}'.format(account, region, docker_name, docker_tag)\n", + "print(image)\n", + "\n", + "params = {}\n", + "params['dataset'] = 'cora'\n", + "estimator = PyTorch(entry_point=CODE_PATH,\n", + " role=role, \n", + " train_instance_count=1, \n", + " train_instance_type='ml.p3.2xlarge',\n", + " image_name=image,\n", + " hyperparameters=params,\n", + " sagemaker_session=sess)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "After you define the estimator, specify the hyperparameters you want to tune and their possible values. You have three different types of hyperparameters.\n", + " * Categorical parameters need to take one value from a discrete set. Define this by passing the list of possible values to CategoricalParameter(list)\n", + " * Continuous parameters can take any real number value between the minimum and maximum value, defined by ContinuousParameter(min, max)\n", + " * Integer parameters can take any integer value between the minimum and maximum value, defined by IntegerParameter(min, max)\n", + " \n", + "Note that, if possible, it's almost always best to specify a value as the least restrictive type. For example, tuning threshold as a continuous value between 0.01 and 0.2 is likely to yield a better result than tuning as a categorical parameter with possible values of 0.01, 0.1, 0.15, or 0.2." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "hyperparameter_ranges = {'lr': ContinuousParameter(0.001, 0.01),\n", + " 'n-epochs': IntegerParameter(100, 200)}" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Next, specify the objective metric that you want to tune and its definition. This includes the regular expression (regex) needed to extract that metric from the Amazon CloudWatch logs of the training job." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "objective_metric_name = 'Validation-accuracy'\n", + "metric_definitions = [{'Name': 'Validation-accuracy',\n", + " 'Regex': 'Test Accuracy ([0-9\\\\.]+)%'}]" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Now, create a HyperparameterTuner object, to which you pass:\n", + "\n", + " * The training estimator you created above\n", + " * The hyperparameter ranges\n", + " * Objective metric name and definition\n", + " * Number of training jobs to run in total and how many training jobs should be run simultaneously. More parallel jobs will finish tuning sooner, but may sacrifice accuracy. We recommend you set the parallel jobs value to less than 10% of the total number of training jobs (we'll set it higher just for this example to keep it short).\n", + " * Whether you should maximize or minimize the objective metric. You don't need to specify it here because it defaults to 'Maximize', which is what you want for validation accuracy" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "tuner = HyperparameterTuner(estimator,\n", + " objective_metric_name,\n", + " hyperparameter_ranges,\n", + " metric_definitions,\n", + " max_jobs=6,\n", + " max_parallel_jobs=2)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "And finally, start the tuning job by calling .fit()." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "tuner.fit()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Let's run a quick check of the hyperparameter tuning job's status to make sure it started successfully and is InProgress." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "boto3.client('sagemaker').describe_hyper_parameter_tuning_job(\n", + " HyperParameterTuningJobName=tuner.latest_tuning_job.job_name)['HyperParameterTuningJobStatus']" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Wrap-up\n", + "After you start the hyperparameter tuning job, it will run in the background. You can close this notebook. After it finishes, you can go to the console to analyze the result.\n", + "\n", + "For more information about Amazon SageMaker's Hyperparameter tuning, see the AWS documentation." + ] + } + ], + "metadata": { + "kernelspec": { + "display_name": "conda_pytorch_p36", + "language": "python", + "name": "conda_pytorch_p36" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.6.5" + } + }, + "nbformat": 4, + "nbformat_minor": 2 +} diff --git a/sagemaker-python-sdk/dgl_gcn_tox21/README.md b/sagemaker-python-sdk/dgl_gcn_tox21/README.md new file mode 100644 index 0000000000..b8434ecfd4 --- /dev/null +++ b/sagemaker-python-sdk/dgl_gcn_tox21/README.md @@ -0,0 +1,33 @@ +# Deep Graph Library Amazon SageMaker GCN examples for molecular property prediction + +In this tutorial, you learn how to run graph convolutional networks (GCNs) for molecular property prediction by using Amazon SageMaker. + +With atoms being nodes and bonds being edges, molecules have been an important type of data in the application of +graph neural networks. In this example, you use the dataset **Tox21**. The +Toxicology in the 21st Century (Tox21) initiative created a public database measuring the toxicity of compounds. The +dataset contains qualitative toxicity measurements for 8,014 compounds on 12 different targets, including nuclear +receptors and stress response pathways. Each target yields a binary classification problem. Therefore, you can model the +problems as graph classification problems. The molecular benchmark [MoleculeNet](http://moleculenet.ai/) randomly splits the dataset into a training, validation, +and test set with an 80:10:10 ratio. This tutorial follows that approach. + +The model uses atom descriptors as initial node features. After updating the node features with the usual GCN layers, it +combines the sum and maximum of the updated node (atom) representations into a graph (molecule) representation. Finally, a +feedforward neural network (FNN) makes predictions from that representation. 
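A minimal sketch of that readout step for a single molecule, in plain PyTorch (the example itself uses DGL's `GCNClassifier` from the model zoo, so this is illustrative only):

```python
import torch

def molecule_readout(node_feats: torch.Tensor) -> torch.Tensor:
    # node_feats: (num_atoms, feat_dim) GCN-updated atom representations.
    h_sum = node_feats.sum(dim=0)         # order-invariant sum over atoms
    h_max = node_feats.max(dim=0).values  # most activated atom per feature
    return torch.cat([h_sum, h_max])      # (2 * feat_dim,) molecule vector for the FNN
```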
+ +For more information about DGL and GCN, see https://docs.dgl.ai + +## Set up a Conda environment for DGL (PyTorch backend) + +To install a Conda environment for DGL with a GPU-enabled PyTorch backend, use the following steps. +``` +# Clone Python3 environment + +conda create --name DGL_py36_pytorch1.2_chem --clone python3 + +# Install PyTorch and DGL +conda install --name DGL_py36_pytorch1.2_chem pytorch=1.2 torchvision -c pytorch +conda install --name DGL_py36_pytorch1.2_chem -c dglteam dgl-cuda10.0=0.4.0 +conda install --name DGL_py36_pytorch1.2_chem --update-deps --force libpng +conda install --name DGL_py36_pytorch1.2_chem --update-deps --force -c conda-forge rdkit=2018.09.3 +``` +You can now select the DGL_py36_pytorch1.2_chem Conda environment. diff --git a/sagemaker-python-sdk/dgl_gcn_tox21/gcn_tox21_cpu.Dockerfile b/sagemaker-python-sdk/dgl_gcn_tox21/gcn_tox21_cpu.Dockerfile new file mode 100644 index 0000000000..49c4fa972c --- /dev/null +++ b/sagemaker-python-sdk/dgl_gcn_tox21/gcn_tox21_cpu.Dockerfile @@ -0,0 +1,4 @@ +FROM dgllib/dgl-sagemaker-cpu:dgl_0.4_pytorch_1.2.0_rdkit + +RUN pip install -U scikit-learn +RUN pip install pandas \ No newline at end of file diff --git a/sagemaker-python-sdk/dgl_gcn_tox21/gcn_tox21_gpu.Dockerfile b/sagemaker-python-sdk/dgl_gcn_tox21/gcn_tox21_gpu.Dockerfile new file mode 100644 index 0000000000..98e4dde536 --- /dev/null +++ b/sagemaker-python-sdk/dgl_gcn_tox21/gcn_tox21_gpu.Dockerfile @@ -0,0 +1,4 @@ +FROM dgllib/dgl-sagemaker-gpu:dgl_0.4_pytorch_1.2.0_rdkit + +RUN pip install -U scikit-learn +RUN pip install pandas diff --git a/sagemaker-python-sdk/dgl_gcn_tox21/main.py b/sagemaker-python-sdk/dgl_gcn_tox21/main.py new file mode 100644 index 0000000000..a2b78a2827 --- /dev/null +++ b/sagemaker-python-sdk/dgl_gcn_tox21/main.py @@ -0,0 +1,234 @@ +import argparse +import dgl +import json +import numpy as np +import os +import random +import torch + +from datetime import datetime +from dgl import model_zoo +from dgl.data.chem import Tox21 +from dgl.data.utils import split_dataset +from sklearn.metrics import roc_auc_score +from torch.nn import BCEWithLogitsLoss +from torch.optim import Adam +from torch.utils.data import DataLoader + +def setup(args, seed=0): + args['device'] = 'cuda' if torch.cuda.is_available() else 'cpu' + + # Set random seed + random.seed(seed) + np.random.seed(seed) + torch.manual_seed(seed) + if torch.cuda.is_available(): + torch.cuda.manual_seed(seed) + return args + +def collate_molgraphs(data): + """Batching a list of datapoints for dataloader.""" + smiles, graphs, labels, masks = map(list, zip(*data)) + + bg = dgl.batch(graphs) + bg.set_n_initializer(dgl.init.zero_initializer) + bg.set_e_initializer(dgl.init.zero_initializer) + labels = torch.stack(labels, dim=0) + masks = torch.stack(masks, dim=0) + return smiles, bg, labels, masks + +class EarlyStopper(object): + def __init__(self, patience, filename=None): + if filename is None: + # Name checkpoint based on time + dt = datetime.now() + filename = 'early_stop_{}_{:02d}-{:02d}-{:02d}.pth'.format( + dt.date(), dt.hour, dt.minute, dt.second) + filename = os.path.join('/opt/ml/model', filename) + + self.patience = patience + self.counter = 0 + self.filename = filename + self.best_score = None + self.early_stop = False + + def save_checkpoint(self, model): + '''Save the model when the metric on the validation set improves.''' + torch.save({'model_state_dict': model.state_dict()}, self.filename) + + def load_checkpoint(self, model): + '''Load the model saved with early stopping.''' 
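+        # The checkpoint is written under /opt/ml/model (see __init__), so
+        # Amazon SageMaker also uploads it to S3 as part of the model artifact.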
+ model.load_state_dict(torch.load(self.filename)['model_state_dict']) + + def step(self, score, model): + if (self.best_score is None) or (score > self.best_score): + self.best_score = score + self.save_checkpoint(model) + self.counter = 0 + else: + self.counter += 1 + print('EarlyStopping counter: {:d} out of {:d}'.format(self.counter, self.patience)) + if self.counter >= self.patience: + self.early_stop = True + return self.early_stop + +class Meter(object): + """Track and summarize model performance on a dataset for + (multi-label) binary classification.""" + def __init__(self): + self.mask = [] + self.y_pred = [] + self.y_true = [] + + def update(self, y_pred, y_true, mask): + """Update for the result of an iteration + + Parameters + ---------- + y_pred : float32 tensor + Predicted molecule labels with shape (B, T), + B for batch size and T for the number of tasks + y_true : float32 tensor + Ground truth molecule labels with shape (B, T) + mask : float32 tensor + Mask for indicating the existence of ground + truth labels with shape (B, T) + """ + self.y_pred.append(y_pred.detach().cpu()) + self.y_true.append(y_true.detach().cpu()) + self.mask.append(mask.detach().cpu()) + + def roc_auc_score(self): + """Compute roc-auc score for each task. + + Returns + ------- + list of float + roc-auc score for all tasks + """ + mask = torch.cat(self.mask, dim=0) + y_pred = torch.cat(self.y_pred, dim=0) + y_true = torch.cat(self.y_true, dim=0) + # This assumes binary case only + y_pred = torch.sigmoid(y_pred) + n_tasks = y_true.shape[1] + scores = [] + for task in range(n_tasks): + task_w = mask[:, task] + task_y_true = y_true[:, task][task_w != 0].numpy() + task_y_pred = y_pred[:, task][task_w != 0].numpy() + scores.append(roc_auc_score(task_y_true, task_y_pred)) + return scores + +def run_a_train_epoch(args, epoch, model, data_loader, loss_criterion, optimizer): + model.train() + train_meter = Meter() + for batch_id, batch_data in enumerate(data_loader): + smiles, bg, labels, masks = batch_data + atom_feats = bg.ndata.pop(args['atom_data_field']) + atom_feats, labels, masks = atom_feats.to(args['device']), \ + labels.to(args['device']), \ + masks.to(args['device']) + logits = model(bg, atom_feats) + # Mask non-existing labels + loss = (loss_criterion(logits, labels) * (masks != 0).float()).mean() + optimizer.zero_grad() + loss.backward() + optimizer.step() + print('epoch {:d}/{:d}, batch {:d}/{:d}, loss {:.4f}'.format( + epoch + 1, args['n_epochs'], batch_id + 1, len(data_loader), loss.item())) + train_meter.update(logits, labels, masks) + train_score = np.mean(train_meter.roc_auc_score()) + print('epoch {:d}/{:d}, training roc-auc {:.4f}'.format( + epoch + 1, args['n_epochs'], train_score)) + +def run_an_eval_epoch(args, model, data_loader): + model.eval() + eval_meter = Meter() + with torch.no_grad(): + for batch_id, batch_data in enumerate(data_loader): + smiles, bg, labels, masks = batch_data + atom_feats = bg.ndata.pop(args['atom_data_field']) + atom_feats, labels = atom_feats.to(args['device']), labels.to(args['device']) + logits = model(bg, atom_feats) + eval_meter.update(logits, labels, masks) + return np.mean(eval_meter.roc_auc_score()) + +def load_sagemaker_config(args): + file_path = '/opt/ml/input/config/hyperparameters.json' + if os.path.isfile(file_path): + with open(file_path, 'r') as f: + new_args = json.load(f) + for k, v in new_args.items(): + if k not in args: + continue + if isinstance(args[k], int): + v = int(v) + if isinstance(args[k], float): + v = float(v) + args[k] = v + 
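+    # Values in hyperparameters.json always arrive as strings; the casts above
+    # restore the int/float types implied by the argparse defaults.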
return args + +def main(args): + args = setup(args) + + dataset = Tox21() + train_set, val_set, test_set = split_dataset(dataset, shuffle=True) + train_loader = DataLoader(train_set, batch_size=args['batch_size'], + shuffle=True, collate_fn=collate_molgraphs) + val_loader = DataLoader(val_set, batch_size=args['batch_size'], + shuffle=True, collate_fn=collate_molgraphs) + test_loader = DataLoader(test_set, batch_size=args['batch_size'], + shuffle=True, collate_fn=collate_molgraphs) + + model = model_zoo.chem.GCNClassifier( + in_feats=args['n_input'], + gcn_hidden_feats=[args['n_hidden'] for _ in range(args['n_layers'])], + n_tasks=dataset.n_tasks, + classifier_hidden_feats=args['n_hidden']).to(args['device']) + loss_criterion = BCEWithLogitsLoss( + pos_weight=torch.tensor(dataset.task_pos_weights).to(args['device']), reduction='none') + optimizer = Adam(model.parameters(), lr=args['lr']) + stopper = EarlyStopper(args['patience']) + + for epoch in range(args['n_epochs']): + # Train + run_a_train_epoch(args, epoch, model, train_loader, loss_criterion, optimizer) + + # Validation and early stop + val_score = run_an_eval_epoch(args, model, val_loader) + early_stop = stopper.step(val_score, model) + print('epoch {:d}/{:d}, validation roc-auc {:.4f}, best validation roc-auc {:.4f}'.format( + epoch + 1, args['n_epochs'], val_score, stopper.best_score)) + if early_stop: + break + + stopper.load_checkpoint(model) + test_score = run_an_eval_epoch(args, model, test_loader) + print('Best validation score {:.4f}'.format(stopper.best_score)) + print('Test score {:.4f}'.format(test_score)) + +def parse_args(): + parser = argparse.ArgumentParser(description='GCN for Tox21') + parser.add_argument('--batch-size', type=int, default=128, + help='Number of graphs (molecules) per batch') + parser.add_argument('--lr', type=float, default=1e-3, + help='Learning rate') + parser.add_argument('--n-epochs', type=int, default=100, + help='Maximum number of training epochs') + parser.add_argument('--atom-data-field', type=str, default='h', + help='Name for storing atom features') + parser.add_argument('--n-input', type=int, default=74, + help='Size for input atom features') + parser.add_argument('--n-hidden', type=int, default=64, + help='Size for hidden representations') + parser.add_argument('--n-layers', type=int, default=2, + help='Number of hidden layers') + parser.add_argument('--patience', type=int, default=10, + help='Number of epochs to wait before early stop') + return parser.parse_args().__dict__ + +if __name__ == '__main__': + args = parse_args() + args = load_sagemaker_config(args) + main(args) diff --git a/sagemaker-python-sdk/dgl_gcn_tox21/pytorch-gcn-tox21-hypertune.ipynb b/sagemaker-python-sdk/dgl_gcn_tox21/pytorch-gcn-tox21-hypertune.ipynb new file mode 100644 index 0000000000..e2dba10856 --- /dev/null +++ b/sagemaker-python-sdk/dgl_gcn_tox21/pytorch-gcn-tox21-hypertune.ipynb @@ -0,0 +1,260 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Hyperparameter tuning with Amazon SageMaker for molecular property prediction" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Contents\n", + "\n", + "1. [Background](#Background) \n", + "2. [Setup](#Setup) \n", + "3. [Code](#Code) \n", + "4. [Tune](#Tune) \n", + "5. [Wrap-up](#Wrap-up) \n", + "\n", + "## Background\n", + "\n", + "This example notebook demonstrates a graph-based molecular property prediction model with automatic hyperparameter tuning. 
The implementation is based on DGL and PyTorch. To find the best hyperparameters, it leverages SageMaker to kick off multiple training jobs with different hyperparameter combinations. In this example, you use the [Amazon SageMaker Python SDK](https://github.com/aws/sagemaker-python-sdk) to create a hyperparameter tuning job." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Setup\n", + "\n", + "This notebook was created and tested on an ml.p3.2xlarge notebook instance.\n", + "\n", + + "Prerequisites\n", + " * Before you start this tutorial, review the `pytorch-gcn-tox21.ipynb` example and ensure you have an image in your Amazon Elastic Container Registry (Amazon ECR) at \{account\}.dkr.ecr.\{region\}.amazonaws.com/sagemaker-dgl-pytorch-gcn-tox21:latest.\n", + " * An S3 bucket and prefix exist that you want to use for training and model data. This should be within the same Region as the notebook instance, training, and hosting.\n", + " * An IAM role ARN exists that you are going to use to give training and hosting access to your data. See the documentation for more details on creating these. Note that if a role is not associated with the current notebook instance, or more than one role is required for training or hosting, you should replace sagemaker.get_execution_role() with the appropriate full IAM role ARN strings." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "import sagemaker\n", + "\n", + "from sagemaker import get_execution_role\n", + "from sagemaker.session import Session\n", + "\n", + "# Setup session\n", + "sess = sagemaker.Session()\n", + "\n", + "# S3 bucket for saving code and model artifacts.\n", + "# Feel free to specify a different bucket here if you wish.\n", + "bucket = sess.default_bucket()\n", + "\n", + "# Location to put your custom code.\n", + "custom_code_upload_location = 'customcode'\n", + "\n", + "# IAM execution role that gives Amazon SageMaker access to resources in your AWS account.\n", + "# Use the Amazon SageMaker Python SDK to get the role from the notebook environment. \n", + "role = get_execution_role()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Code\n", + "\n", + "To run Docker containers with Amazon SageMaker, provide a Python script for the container to run. In this example, `main.py` provides all the code you need to train an Amazon SageMaker model." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "!cat main.py" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Tune\n", + "Similar to training a single training job in Amazon SageMaker, define your training estimator, passing in the code scripts, IAM role, (per job) hardware configuration, and any hyperparameters that you are not tuning.\n", + "\n", + "You must have a Docker image in your Amazon Elastic Container Registry (Amazon ECR), built by following the steps in pytorch-gcn-tox21.ipynb."
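If you are not sure whether the image is already in place, a quick check (a sketch; the repository name is assumed to match the one built in pytorch-gcn-tox21.ipynb):

```python
import boto3

ecr = boto3.client('ecr')
try:
    # Repository name assumed from pytorch-gcn-tox21.ipynb.
    ecr.describe_repositories(repositoryNames=['sagemaker-dgl-pytorch-gcn-tox21'])
    print('ECR repository found.')
except ecr.exceptions.RepositoryNotFoundException:
    print('Image missing: build and push it first (see pytorch-gcn-tox21.ipynb).')
```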
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "# Set target dgl-docker name\n", + "docker_name='sagemaker-dgl-pytorch-gcn-tox21'\n", + "\n", + "CODE_PATH = 'main.py'\n", + "code_location = sess.upload_data(CODE_PATH, bucket=bucket, key_prefix=custom_code_upload_location)\n", + "\n", + "account = sess.boto_session.client('sts').get_caller_identity()['Account']\n", + "region = sess.boto_session.region_name\n", + "image = '{}.dkr.ecr.{}.amazonaws.com/{}:latest'.format(account, region, docker_name)\n", + "\n", + "estimator = sagemaker.estimator.Estimator(image,\n", + " role, \n", + " train_instance_count=1, \n", + " train_instance_type='ml.p3.2xlarge',\n", + " hyperparameters={'entrypoint': CODE_PATH},\n", + " sagemaker_session=sess)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "After you define your estimator, specify the hyperparameters that you want to tune and their possible values. Depending on the type of possible values, the hyperparameters can be divided into three classes:\n", + "\n", + "* **Categorical**: Its possible values form a discrete set, represented by `CategoricalParameter(list)`.\n", + "* **Continuous**: It can take any real number within an interval `[min, max]` and is represented by `ContinuousParameter(min, max)`.\n", + "* **Integer**: It can take any integer value within an interval `[min, max]` and is represented by `IntegerParameter(min, max)`.\n", + "\n", + "Note that it's almost always better to specify a value as the least restrictive type. For example, `ContinuousParameter(0.01, 0.2)` is less restrictive than `CategoricalParameter([0.01, 0.1, 0.15, 0.2])`." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "from sagemaker.tuner import IntegerParameter, CategoricalParameter, ContinuousParameter\n", + "\n", + "hyper_ranges = {'lr': ContinuousParameter(1e-4, 1e-2),\n", + " 'patience': IntegerParameter(5, 30),\n", + " 'n_hidden': CategoricalParameter([32, 64, 128])}" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Next, specify the objective metric to tune and its definition. This includes the regular expression (regex) needed to extract that metric from the Amazon CloudWatch logs of the training job." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "objective_name = 'Validation_roc_auc'\n", + "metric_definitions = [{'Name': objective_name,\n", + " 'Regex': 'Best validation score ([0-9\\\\.]+)'}]" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Now, create a `HyperparameterTuner` object, to which you pass:\n", + "\n", + " * The training estimator you created above\n", + " * The hyperparameter ranges\n", + " * Objective metric name and definition\n", + " * Number of training jobs to run in total and how many training jobs should be run simultaneously. More parallel jobs will finish tuning sooner, but may sacrifice accuracy. We recommend you set the parallel jobs value to less than 10 percent of the total number of training jobs. It is set higher just for this example to keep it short.\n", + " * Whether you should maximize or minimize the objective metric. You don't need to specify it here because it defaults to 'Maximize', which is what you want for validation roc-auc." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "from sagemaker.tuner import HyperparameterTuner\n", + "\n", + "tuner = HyperparameterTuner(estimator,\n", + " objective_name,\n", + " hyper_ranges,\n", + " metric_definitions,\n", + " max_jobs=6,\n", + " max_parallel_jobs=2)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Finally, start the tuning job by calling `.fit()`." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "tuner.fit(inputs={'training-code': code_location})" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Check the hyperparameter tuning job's status to make sure it started successfully and is InProgress." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "import boto3\n", + "\n", + "boto3.client('sagemaker').describe_hyper_parameter_tuning_job(\n", + " HyperParameterTuningJobName=tuner.latest_tuning_job.job_name)['HyperParameterTuningJobStatus']" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Wrap-up\n", + "After the hyperparameter tuning job is started, it runs in the background and you can close this notebook. When it's finished, you can go to the console to analyze the result.\n", + "\n", + "For more information about Amazon SageMaker's Hyperparameter Tuning, see the AWS documentation." + ] + } + ], + "metadata": { + "kernelspec": { + "display_name": "conda_pytorch_p36", + "language": "python", + "name": "conda_pytorch_p36" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.6.5" + } + }, + "nbformat": 4, + "nbformat_minor": 2 +} diff --git a/sagemaker-python-sdk/dgl_gcn_tox21/pytorch-gcn-tox21.ipynb b/sagemaker-python-sdk/dgl_gcn_tox21/pytorch-gcn-tox21.ipynb new file mode 100644 index 0000000000..30ee98841b --- /dev/null +++ b/sagemaker-python-sdk/dgl_gcn_tox21/pytorch-gcn-tox21.ipynb @@ -0,0 +1,211 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Training Amazon SageMaker models for molecular property prediction by using DGL with PyTorch backend\n", + "\n", + "The **Amazon SageMaker Python SDK** makes it easy to train Deep Graph Library (DGL) models. In this example, you train a simple graph neural network for molecular toxicity prediction by using [DGL](https://github.com/dmlc/dgl) and the Tox21 dataset.\n", + "\n", + "The dataset contains qualitative toxicity measurements for 8,014 compounds on 12 different targets, including nuclear \n", + "receptors and stress-response pathways. Each target yields a binary classification problem. You can model the problem as a graph classification problem. " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Setup" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Define a few variables that you need later in the example."
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "import sagemaker\n", + "from sagemaker import get_execution_role\n", + "from sagemaker.session import Session\n", + "\n", + "# Setup session\n", + "sess = sagemaker.Session()\n", + "\n", + "# S3 bucket for saving code and model artifacts.\n", + "# Feel free to specify a different bucket here if you wish.\n", + "bucket = sess.default_bucket()\n", + "\n", + "# Location to put your custom code.\n", + "custom_code_upload_location = 'customcode'\n", + "\n", + "# IAM execution role that gives Amazon SageMaker access to resources in your AWS account.\n", + "# You can use the Amazon SageMaker Python SDK to get the role from the notebook environment. \n", + "role = get_execution_role()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Training Script" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "`main.py` provides all the code you need for training a molecular property prediction model by using Amazon SageMaker." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "!cat main.py" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Bring Your Own Image for Amazon SageMaker\n", + "\n", + "In this example, you need the RDKit library to handle the Tox21 dataset. The DGL CPU and GPU Docker images on Docker Hub under the dgllib registry have RDKit pre-installed (dgllib/dgl-sagemaker-cpu:dgl_0.4_pytorch_1.2.0_rdkit for CPU and dgllib/dgl-sagemaker-gpu:dgl_0.4_pytorch_1.2.0_rdkit for GPU). Pull the image you need and push it into your Amazon ECR. The following script helps you do so. You can skip this step if you have already prepared your DGL Docker image in your Amazon Elastic Container Registry (Amazon ECR)." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "%%sh\n", + "# For CPU default_docker_name=\"dgllib/dgl-sagemaker-cpu:dgl_0.4_pytorch_1.2.0_rdkit\"\n", + "default_docker_name=\"dgllib/dgl-sagemaker-gpu:dgl_0.4_pytorch_1.2.0_rdkit\"\n", + "docker pull $default_docker_name\n", + "\n", + "docker_name=sagemaker-dgl-pytorch-gcn-tox21\n", + "\n", + "# For CPU docker build -t $docker_name -f gcn_tox21_cpu.Dockerfile .\n", + "docker build -t $docker_name -f gcn_tox21_gpu.Dockerfile .\n", + "\n", + "account=$(aws sts get-caller-identity --query Account --output text)\n", + "echo $account\n", + "region=$(aws configure get region)\n", + "\n", + "fullname=\"${account}.dkr.ecr.${region}.amazonaws.com/${docker_name}:latest\"\n", + "# If the repository doesn't exist in ECR, create it.\n", + "aws ecr describe-repositories --repository-names \"${docker_name}\" > /dev/null 2>&1\n", + "if [ $? 
-ne 0 ]\n", + "then\n", + " aws ecr create-repository --repository-name \"${docker_name}\" > /dev/null\n", + "fi\n", + "\n", + "# Get the login command from ECR and execute it directly\n", + "$(aws ecr get-login --region ${region} --no-include-email)\n", + "\n", + "docker tag ${docker_name} ${fullname}\n", + "docker push ${fullname}" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## The Amazon SageMaker Estimator class\n", + "\n", + "The Amazon SageMaker Estimator allows you to run a single machine in Amazon SageMaker, using CPU or GPU-based instances.\n", + "\n", + "When you create the estimator, pass in the file name of the training script and the name of the IAM execution role. Also provide a few other parameters. `train_instance_count` and `train_instance_type` determine the number and type of SageMaker instances that will be used for the training job. The hyperparameters can be passed to the training script via a dict of values. See `main.py` for how they are handled.\n", + "\n", + "The entrypoint of the Amazon SageMaker Docker image (e.g., dgllib/dgl-sagemaker-gpu:dgl_0.4_pytorch_1.2.0_rdkit) is a train script under /usr/bin/. The train script inside the DGL Docker image provided above reads the real entrypoint from the hyperparameters (under the key 'entrypoint') and runs it from the 'training-code' data channel (/opt/ml/input/data/training-code/).\n", + "\n", + "For this example, choose one ml.p3.2xlarge instance. You can also use a CPU instance such as ml.c4.2xlarge for the CPU image." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "import boto3\n", + "\n", + "# Set target dgl-docker name\n", + "docker_name='sagemaker-dgl-pytorch-gcn-tox21'\n", + "\n", + "CODE_PATH = 'main.py'\n", + "code_location = sess.upload_data(CODE_PATH, bucket=bucket, key_prefix=custom_code_upload_location)\n", + "\n", + "account = sess.boto_session.client('sts').get_caller_identity()['Account']\n", + "region = sess.boto_session.region_name\n", + "image = '{}.dkr.ecr.{}.amazonaws.com/{}:latest'.format(account, region, docker_name)\n", + "print(image)\n", + "\n", + "estimator = sagemaker.estimator.Estimator(image,\n", + " role, \n", + " train_instance_count=1, \n", + " train_instance_type= 'ml.p3.2xlarge', #'ml.c4.2xlarge'\n", + " hyperparameters={'entrypoint': CODE_PATH},\n", + " sagemaker_session=sess)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Running the Training Job\n", + "\n", + "After you construct an Estimator object, fit it by using Amazon SageMaker. 
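When `fit` runs, the stock train script inside the image dispatches to `main.py` as described above. A rough sketch of what that dispatch amounts to (illustrative only; the actual script in the image may differ):

```python
import json
import os
import subprocess

# Read the 'entrypoint' hyperparameter that SageMaker wrote into the container...
with open('/opt/ml/input/config/hyperparameters.json') as f:
    entrypoint = json.load(f)['entrypoint']  # e.g. 'main.py'

# ...and run that script from the 'training-code' input channel.
script = os.path.join('/opt/ml/input/data/training-code', entrypoint)
subprocess.run(['python', script], check=True)
```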
" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "estimator.fit({'training-code': code_location})" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Output\n", + "You can get the model training output from the Amazon Sagemaker console by searching for the training task and looking for the address of 'S3 model artifact'" + ] + } + ], + "metadata": { + "kernelspec": { + "display_name": "conda_pytorch_p36", + "language": "python", + "name": "conda_pytorch_p36" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.6.5" + } + }, + "nbformat": 4, + "nbformat_minor": 2 +} diff --git a/sagemaker-python-sdk/dgl_kge/Readme.md b/sagemaker-python-sdk/dgl_kge/Readme.md new file mode 100644 index 0000000000..049159fbdb --- /dev/null +++ b/sagemaker-python-sdk/dgl_kge/Readme.md @@ -0,0 +1,6 @@ +# Example of Knowledge Graph Embedding Using Deep Graph Library +This is an example of knowledge graph embedding (KGE) using Deep Graph Library (DGL). It uses the FB15K dataset, a knowledge base of general facts. An example use case would be to graph the relationships of persons and predict their nationality. + +The example notebooks demonstrate implementations of [Apache MXNet](https://mxnet.apache.org/) and [PyTorch](https://pytorch.org/) for the [DGL](https://www.dgl.ai/) backend. They use Amazon SageMaker deep learning containers that are preconfigured with DGL, so you don't have to build the dependencies yourself. + +For more information about Deep Graph Library (DGL) please visit the DGL documentation website: https://docs.dgl.ai diff --git a/sagemaker-python-sdk/dgl_kge/dataloader/KGDataset.py b/sagemaker-python-sdk/dgl_kge/dataloader/KGDataset.py new file mode 100644 index 0000000000..c7c2492479 --- /dev/null +++ b/sagemaker-python-sdk/dgl_kge/dataloader/KGDataset.py @@ -0,0 +1,139 @@ +import os + +def _download_and_extract(url, path, filename): + import shutil, zipfile + from tqdm import tqdm + import requests + + fn = os.path.join(path, filename) + + while True: + try: + with zipfile.ZipFile(fn) as zf: + zf.extractall(path) + print('Unzip finished.') + break + except Exception: + os.makedirs(path, exist_ok=True) + f_remote = requests.get(url, stream=True) + sz = f_remote.headers.get('content-length') + assert f_remote.status_code == 200, 'fail to open {}'.format(url) + with open(fn, 'wb') as writer: + for chunk in tqdm(f_remote.iter_content(chunk_size=1024*1024)): + writer.write(chunk) + print('Download finished. Unzipping the file...') + +class KGDataset1: + '''Load a knowledge graph with format 1 + + In this format, the folder with a knowledge graph has five files: + * entities.dict stores the mapping between entity Id and entity name. + * relations.dict stores the mapping between relation Id and relation name. + * train.txt stores the triples in the training set. + * valid.txt stores the triples in the validation set. + * test.txt stores the triples in the test set. + + The mapping between entity (relation) Id and entity (relation) name is stored as 'id\tname'. + + The triples are stored as 'head_name\trelation_name\ttail_name'. 
+ ''' + def __init__(self, path, name): + url = 'https://s3.us-east-2.amazonaws.com/dgl.ai/dataset/{}.zip'.format(name) + + if not os.path.exists(os.path.join(path, name)): + print('File not found. Downloading from', url) + _download_and_extract(url, path, name + '.zip') + path = os.path.join(path, name) + + with open(os.path.join(path, 'entities.dict')) as f: + entity2id = {} + for line in f: + eid, entity = line.strip().split('\t') + entity2id[entity] = int(eid) + + self.entity2id = entity2id + + with open(os.path.join(path, 'relations.dict')) as f: + relation2id = {} + for line in f: + rid, relation = line.strip().split('\t') + relation2id[relation] = int(rid) + + self.relation2id = relation2id + + # TODO: deal with the countries dataset. + + self.n_entities = len(self.entity2id) + self.n_relations = len(self.relation2id) + + self.train = self.read_triple(path, 'train') + self.valid = self.read_triple(path, 'valid') + self.test = self.read_triple(path, 'test') + + def read_triple(self, path, mode): + # mode: train/valid/test + triples = [] + with open(os.path.join(path, '{}.txt'.format(mode))) as f: + for line in f: + h, r, t = line.strip().split('\t') + triples.append((self.entity2id[h], self.relation2id[r], self.entity2id[t])) + + return triples + + +class KGDataset2: + '''Load a knowledge graph with format 2 + + In this format, the folder with a knowledge graph has five files: + * entity2id.txt stores the mapping between entity name and entity Id. + * relation2id.txt stores the mapping between relation name and relation Id. + * train.txt stores the triples in the training set. + * valid.txt stores the triples in the validation set. + * test.txt stores the triples in the test set. + + The mapping between entity (relation) name and entity (relation) Id is stored as 'name\tid'. + + The triples are stored as 'head_nid\ttail_nid\trelation_id'. + ''' + def __init__(self, path, name): + url = 'https://s3.us-east-2.amazonaws.com/dgl.ai/dataset/{}.zip'.format(name) + + if not os.path.exists(os.path.join(path, name)): + print('File not found. Downloading from', url) + _download_and_extract(url, path, '{}.zip'.format(name)) + self.path = os.path.join(path, name) + + f_ent2id = os.path.join(self.path, 'entity2id.txt') + f_rel2id = os.path.join(self.path, 'relation2id.txt') + + with open(f_ent2id) as f_ent: + self.n_entities = int(f_ent.readline()[:-1]) + with open(f_rel2id) as f_rel: + self.n_relations = int(f_rel.readline()[:-1]) + + self.train = self.read_triple(self.path, 'train') + self.valid = self.read_triple(self.path, 'valid') + self.test = self.read_triple(self.path, 'test') + + def read_triple(self, path, mode, skip_first_line=False): + triples = [] + print('Reading {} triples....'.format(mode)) + with open(os.path.join(path, '{}.txt'.format(mode))) as f: + if skip_first_line: + _ = f.readline() + for line in f: + h, t, r = line.strip().split('\t') + triples.append((int(h), int(r), int(t))) + print('Finished. 
Read {} {} triples.'.format(len(triples), mode)) + return triples + + +def get_dataset(data_path, data_name, format_str): + if data_name == 'Freebase': + dataset = KGDataset2(data_path, data_name) + elif format_str == '1': + dataset = KGDataset1(data_path, data_name) + else: + dataset = KGDataset2(data_path, data_name) + + return dataset diff --git a/sagemaker-python-sdk/dgl_kge/dataloader/__init__.py b/sagemaker-python-sdk/dgl_kge/dataloader/__init__.py new file mode 100644 index 0000000000..d51548b06a --- /dev/null +++ b/sagemaker-python-sdk/dgl_kge/dataloader/__init__.py @@ -0,0 +1,2 @@ +from .KGDataset import * +from .sampler import * diff --git a/sagemaker-python-sdk/dgl_kge/dataloader/sampler.py b/sagemaker-python-sdk/dgl_kge/dataloader/sampler.py new file mode 100644 index 0000000000..6c470fbb70 --- /dev/null +++ b/sagemaker-python-sdk/dgl_kge/dataloader/sampler.py @@ -0,0 +1,312 @@ +import math +import numpy as np +import scipy as sp +import dgl.backend as F +import dgl +import os +import pickle +import time + +# This partitions a list of edges based on relations to make sure +# each partition has roughly the same number of edges and relations. +def RelationPartition(edges, n): + print('relation partition {} edges into {} parts'.format(len(edges), n)) + rel = np.array([r for h, r, t in edges]) + uniq, cnts = np.unique(rel, return_counts=True) + idx = np.flip(np.argsort(cnts)) + cnts = cnts[idx] + uniq = uniq[idx] + assert cnts[0] > cnts[-1] + edge_cnts = np.zeros(shape=(n,), dtype=np.int64) + rel_cnts = np.zeros(shape=(n,), dtype=np.int64) + rel_dict = {} + for i in range(len(cnts)): + cnt = cnts[i] + r = uniq[i] + idx = np.argmin(edge_cnts) + rel_dict[r] = idx + edge_cnts[idx] += cnt + rel_cnts[idx] += 1 + for i, edge_cnt in enumerate(edge_cnts): + print('part {} has {} edges and {} relations'.format(i, edge_cnt, rel_cnts[i])) + parts = [] + for _ in range(n): + parts.append([]) + for h, r, t in edges: + idx = rel_dict[r] + parts[idx].append((h, r, t)) + return parts + +def RandomPartition(edges, n): + print('random partition {} edges into {} parts'.format(len(edges), n)) + idx = np.random.permutation(len(edges)) + part_size = int(math.ceil(len(idx) / n)) + parts = [] + for i in range(n): + start = part_size * i + end = min(part_size * (i + 1), len(idx)) + parts.append([edges[i] for i in idx[start:end]]) + return parts + +def ConstructGraph(edges, n_entities, i, args): + pickle_name = 'graph_train_{}.pickle'.format(i) + if args.pickle_graph and os.path.exists(os.path.join(args.data_path, args.dataset, pickle_name)): + with open(os.path.join(args.data_path, args.dataset, pickle_name), 'rb') as graph_file: + g = pickle.load(graph_file) + print('Load pickled graph.') + else: + src = [t[0] for t in edges] + etype_id = [t[1] for t in edges] + dst = [t[2] for t in edges] + coo = sp.sparse.coo_matrix((np.ones(len(src)), (src, dst)), shape=[n_entities, n_entities]) + g = dgl.DGLGraph(coo, readonly=True, sort_csr=True) + g.ndata['id'] = F.arange(0, g.number_of_nodes()) + g.edata['id'] = F.tensor(etype_id, F.int64) + if args.pickle_graph: + with open(os.path.join(args.data_path, args.dataset, pickle_name), 'wb') as graph_file: + pickle.dump(g, graph_file) + return g + +class TrainDataset(object): + def __init__(self, dataset, args, weighting=False, ranks=64): + triples = dataset.train + print('|Train|:', len(triples)) + if ranks > 1 and args.rel_part: + triples_list = RelationPartition(triples, ranks) + elif ranks > 1: + triples_list = RandomPartition(triples, ranks) + else: + triples_list 
= [triples]
+        self.graphs = []
+        for i, triples in enumerate(triples_list):
+            g = ConstructGraph(triples, dataset.n_entities, i, args)
+            if weighting:
+                # TODO: weight to be added
+                count = self.count_freq(triples)
+                subsampling_weight = np.vectorize(
+                    lambda h, r, t: np.sqrt(1 / (count[(h, r)] + count[(t, -r - 1)]))
+                )
+                # Unpack the edge arrays for this partition so the subsampling
+                # weight can be evaluated per edge.
+                src = [t[0] for t in triples]
+                etype_id = [t[1] for t in triples]
+                dst = [t[2] for t in triples]
+                weight = subsampling_weight(src, etype_id, dst)
+                g.edata['weight'] = F.zerocopy_from_numpy(weight)
+                # to be added
+            self.graphs.append(g)
+
+    def count_freq(self, triples, start=4):
+        count = {}
+        for head, rel, tail in triples:
+            if (head, rel) not in count:
+                count[(head, rel)] = start
+            else:
+                count[(head, rel)] += 1
+
+            if (tail, -rel - 1) not in count:
+                count[(tail, -rel - 1)] = start
+            else:
+                count[(tail, -rel - 1)] += 1
+        return count
+
+    def create_sampler(self, batch_size, neg_sample_size=2, mode='head', num_workers=5,
+                       shuffle=True, exclude_positive=False, rank=0):
+        EdgeSampler = getattr(dgl.contrib.sampling, 'EdgeSampler')
+        return EdgeSampler(self.graphs[rank],
+                           batch_size=batch_size,
+                           neg_sample_size=neg_sample_size,
+                           negative_mode=mode,
+                           num_workers=num_workers,
+                           shuffle=shuffle,
+                           exclude_positive=exclude_positive,
+                           return_false_neg=False)
+
+class PBGNegEdgeSubgraph(dgl.subgraph.DGLSubGraph):
+    def __init__(self, subg, num_chunks, chunk_size,
+                 neg_sample_size, neg_head):
+        super(PBGNegEdgeSubgraph, self).__init__(subg._parent, subg.sgi)
+        self.subg = subg
+        self.num_chunks = num_chunks
+        self.chunk_size = chunk_size
+        self.neg_sample_size = neg_sample_size
+        self.neg_head = neg_head
+
+    @property
+    def head_nid(self):
+        return self.subg.head_nid
+
+    @property
+    def tail_nid(self):
+        return self.subg.tail_nid
+
+
+def create_neg_subgraph(pos_g, neg_g, is_pbg, neg_head, num_nodes):
+    assert neg_g.number_of_edges() % pos_g.number_of_edges() == 0
+    neg_sample_size = int(neg_g.number_of_edges() / pos_g.number_of_edges())
+    # We use all nodes to create negative edges. Regardless of the sampling algorithm,
+    # we can always view the subgraph with one chunk.
+    if (neg_head and len(neg_g.head_nid) == num_nodes) \
+        or (not neg_head and len(neg_g.tail_nid) == num_nodes):
+        num_chunks = 1
+        chunk_size = pos_g.number_of_edges()
+    elif is_pbg:
+        if pos_g.number_of_edges() < neg_sample_size:
+            num_chunks = 1
+            chunk_size = pos_g.number_of_edges()
+        else:
+            # This is probably the last batch. Let's ignore it.
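+            # With chunked (PBG-style) negative sampling, the positive edges must
+            # split evenly into chunks of size neg_sample_size; a trailing
+            # remainder cannot be chunked, so that partial batch is dropped.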
+ if pos_g.number_of_edges() % neg_sample_size > 0: + return None + num_chunks = int(pos_g.number_of_edges()/ neg_sample_size) + chunk_size = neg_sample_size + else: + num_chunks = pos_g.number_of_edges() + chunk_size = 1 + return PBGNegEdgeSubgraph(neg_g, num_chunks, chunk_size, + neg_sample_size, neg_head) + +class EvalSampler(object): + def __init__(self, g, edges, batch_size, neg_sample_size, mode, num_workers): + EdgeSampler = getattr(dgl.contrib.sampling, 'EdgeSampler') + self.sampler = EdgeSampler(g, + batch_size=batch_size, + seed_edges=edges, + neg_sample_size=neg_sample_size, + negative_mode=mode, + num_workers=num_workers, + shuffle=False, + exclude_positive=False, + relations=g.edata['id'], + return_false_neg=True) + self.sampler_iter = iter(self.sampler) + self.mode = mode + self.neg_head = 'head' in mode + self.g = g + + def __iter__(self): + return self + + def __next__(self): + while True: + pos_g, neg_g = next(self.sampler_iter) + neg_positive = neg_g.edata['false_neg'] + neg_g = create_neg_subgraph(pos_g, neg_g, 'PBG' in self.mode, + self.neg_head, self.g.number_of_nodes()) + if neg_g is not None: + break + + pos_g.copy_from_parent() + neg_g.copy_from_parent() + neg_g.edata['bias'] = F.astype(-neg_positive, F.float32) + return pos_g, neg_g + + def reset(self): + self.sampler_iter = iter(self.sampler) + return self + +class EvalDataset(object): + def __init__(self, dataset, args): + triples = dataset.train + dataset.valid + dataset.test + pickle_name = 'graph_all.pickle' + if args.pickle_graph and os.path.exists(os.path.join(args.data_path, args.dataset, pickle_name)): + with open(os.path.join(args.data_path, args.dataset, pickle_name), 'rb') as graph_file: + g = pickle.load(graph_file) + print('Load pickled graph.') + else: + src = [t[0] for t in triples] + etype_id = [t[1] for t in triples] + dst = [t[2] for t in triples] + coo = sp.sparse.coo_matrix((np.ones(len(src)), (src, dst)), shape=[dataset.n_entities, dataset.n_entities]) + g = dgl.DGLGraph(coo, readonly=True, sort_csr=True) + g.ndata['id'] = F.arange(0, g.number_of_nodes()) + g.edata['id'] = F.tensor(etype_id, F.int64) + if args.pickle_graph: + with open(os.path.join(args.data_path, args.dataset, pickle_name), 'wb') as graph_file: + pickle.dump(g, graph_file) + self.g = g + + self.num_train = len(dataset.train) + self.num_valid = len(dataset.valid) + self.num_test = len(dataset.test) + + if args.eval_percent < 1: + self.valid = np.random.randint(0, self.num_valid, + size=(int(self.num_valid * args.eval_percent),)) + self.num_train + else: + self.valid = np.arange(self.num_train, self.num_train + self.num_valid) + print('|valid|:', len(self.valid)) + + if args.eval_percent < 1: + self.test = np.random.randint(0, self.num_test, + size=(int(self.num_test * args.eval_percent,))) + self.test += self.num_train + self.num_valid + else: + self.test = np.arange(self.num_train + self.num_valid, self.g.number_of_edges()) + print('|test|:', len(self.test)) + + self.num_valid = len(self.valid) + self.num_test = len(self.test) + + def get_edges(self, eval_type): + if eval_type == 'valid': + return self.valid + elif eval_type == 'test': + return self.test + else: + raise Exception('get invalid type: ' + eval_type) + + def check(self, eval_type): + edges = self.get_edges(eval_type) + subg = self.g.edge_subgraph(edges) + if eval_type == 'valid': + data = self.valid + elif eval_type == 'test': + data = self.test + + subg.copy_from_parent() + src, dst, eid = subg.all_edges('all', order='eid') + src_id = subg.ndata['id'][src] + 
dst_id = subg.ndata['id'][dst] + etype = subg.edata['id'][eid] + + orig_src = np.array([t[0] for t in data]) + orig_etype = np.array([t[1] for t in data]) + orig_dst = np.array([t[2] for t in data]) + np.testing.assert_equal(F.asnumpy(src_id), orig_src) + np.testing.assert_equal(F.asnumpy(dst_id), orig_dst) + np.testing.assert_equal(F.asnumpy(etype), orig_etype) + + def create_sampler(self, eval_type, batch_size, neg_sample_size, mode='head', + num_workers=5, rank=0, ranks=1): + edges = self.get_edges(eval_type) + beg = edges.shape[0] * rank // ranks + end = min(edges.shape[0] * (rank + 1) // ranks, edges.shape[0]) + edges = edges[beg: end] + print("eval on {} edges".format(len(edges))) + return EvalSampler(self.g, edges, batch_size, neg_sample_size, mode, num_workers) + +class NewBidirectionalOneShotIterator: + def __init__(self, dataloader_head, dataloader_tail, is_pbg, num_nodes): + self.sampler_head = dataloader_head + self.sampler_tail = dataloader_tail + self.iterator_head = self.one_shot_iterator(dataloader_head, is_pbg, + True, num_nodes) + self.iterator_tail = self.one_shot_iterator(dataloader_tail, is_pbg, + False, num_nodes) + self.step = 0 + + def __next__(self): + self.step += 1 + if self.step % 2 == 0: + pos_g, neg_g = next(self.iterator_head) + else: + pos_g, neg_g = next(self.iterator_tail) + return pos_g, neg_g + + @staticmethod + def one_shot_iterator(dataloader, is_pbg, neg_head, num_nodes): + while True: + for pos_g, neg_g in dataloader: + neg_g = create_neg_subgraph(pos_g, neg_g, is_pbg, neg_head, num_nodes) + if neg_g is None: + continue + + pos_g.copy_from_parent() + neg_g.copy_from_parent() + yield pos_g, neg_g diff --git a/sagemaker-python-sdk/dgl_kge/eval.py b/sagemaker-python-sdk/dgl_kge/eval.py new file mode 100644 index 0000000000..73e89e9153 --- /dev/null +++ b/sagemaker-python-sdk/dgl_kge/eval.py @@ -0,0 +1,155 @@ +from dataloader import EvalDataset, TrainDataset +from dataloader import get_dataset + +import argparse +import torch.multiprocessing as mp +import os +import logging +import time +import pickle + +backend = os.environ.get('DGLBACKEND') +if backend.lower() == 'mxnet': + from train_mxnet import load_model_from_checkpoint + from train_mxnet import test +else: + from train_pytorch import load_model_from_checkpoint + from train_pytorch import test + +class ArgParser(argparse.ArgumentParser): + def __init__(self): + super(ArgParser, self).__init__() + + self.add_argument('--model_name', default='TransE', + choices=['TransE', 'TransH', 'TransR', 'TransD', + 'RESCAL', 'DistMult', 'ComplEx', 'RotatE', 'pRotatE'], + help='model to use') + self.add_argument('--data_path', type=str, default='data', + help='root path of all dataset') + self.add_argument('--dataset', type=str, default='FB15k', + help='dataset name, under data_path') + self.add_argument('--format', type=str, default='1', + help='the format of the dataset.') + self.add_argument('--model_path', type=str, default='ckpts', + help='the place where models are saved') + + self.add_argument('--batch_size', type=int, default=8, + help='batch size used for eval and test') + self.add_argument('--neg_sample_size', type=int, default=-1, + help='negative sampling size for testing') + self.add_argument('--hidden_dim', type=int, default=256, + help='hidden dim used by relation and entity') + self.add_argument('-g', '--gamma', type=float, default=12.0, + help='margin value') + self.add_argument('--eval_percent', type=float, default=1, + help='sample some percentage for evaluation.') + + 
self.add_argument('--gpu', type=int, default=-1,
+                          help='use GPU')
+        self.add_argument('--mix_cpu_gpu', action='store_true',
+                          help='mix CPU and GPU training')
+        self.add_argument('-de', '--double_ent', action='store_true',
+                          help='double entity dim for complex number')
+        self.add_argument('-dr', '--double_rel', action='store_true',
+                          help='double relation dim for complex number')
+        self.add_argument('--seed', type=int, default=0,
+                          help='set random seed for reproducibility')
+
+        self.add_argument('--num_worker', type=int, default=16,
+                          help='number of workers used for loading data')
+        self.add_argument('--num_proc', type=int, default=1,
+                          help='number of processes used')
+
+    def parse_args(self):
+        args = super().parse_args()
+        return args
+
+def get_logger(args):
+    if not os.path.exists(args.model_path):
+        raise Exception('No existing model_path: ' + args.model_path)
+
+    log_file = os.path.join(args.model_path, 'eval.log')
+
+    logging.basicConfig(
+        format='%(asctime)s %(levelname)-8s %(message)s',
+        level=logging.INFO,
+        datefmt='%Y-%m-%d %H:%M:%S',
+        filename=log_file,
+        filemode='w'
+    )
+
+    logger = logging.getLogger(__name__)
+    print("Logs are being recorded at: {}".format(log_file))
+    return logger
+
+def main(args):
+    # load dataset and samplers
+    dataset = get_dataset(args.data_path, args.dataset, args.format)
+    args.pickle_graph = False
+    args.train = False
+    args.valid = False
+    args.test = True
+    args.batch_size_eval = args.batch_size
+
+    logger = get_logger(args)
+    # Here we want to use the regular negative sampler because we need to ensure that
+    # all positive edges are excluded.
+    eval_dataset = EvalDataset(dataset, args)
+    args.neg_sample_size_test = args.neg_sample_size
+    if args.neg_sample_size < 0:
+        args.neg_sample_size_test = args.neg_sample_size = eval_dataset.g.number_of_nodes()
+    if args.num_proc > 1:
+        test_sampler_tails = []
+        test_sampler_heads = []
+        for i in range(args.num_proc):
+            test_sampler_head = eval_dataset.create_sampler('test', args.batch_size,
+                                                            args.neg_sample_size,
+                                                            mode='PBG-head',
+                                                            num_workers=args.num_worker,
+                                                            rank=i, ranks=args.num_proc)
+            test_sampler_tail = eval_dataset.create_sampler('test', args.batch_size,
+                                                            args.neg_sample_size,
+                                                            mode='PBG-tail',
+                                                            num_workers=args.num_worker,
+                                                            rank=i, ranks=args.num_proc)
+            test_sampler_heads.append(test_sampler_head)
+            test_sampler_tails.append(test_sampler_tail)
+    else:
+        test_sampler_head = eval_dataset.create_sampler('test', args.batch_size,
+                                                        args.neg_sample_size,
+                                                        mode='PBG-head',
+                                                        num_workers=args.num_worker,
+                                                        rank=0, ranks=1)
+        test_sampler_tail = eval_dataset.create_sampler('test', args.batch_size,
+                                                        args.neg_sample_size,
+                                                        mode='PBG-tail',
+                                                        num_workers=args.num_worker,
+                                                        rank=0, ranks=1)
+
+    # load model
+    n_entities = dataset.n_entities
+    n_relations = dataset.n_relations
+    ckpt_path = args.model_path
+    model = load_model_from_checkpoint(logger, args, n_entities, n_relations, ckpt_path)
+
+    if args.num_proc > 1:
+        model.share_memory()
+    # test
+    args.step = 0
+    args.max_step = 0
+    if args.num_proc > 1:
+        procs = []
+        for i in range(args.num_proc):
+            proc = mp.Process(target=test, args=(args, model, [test_sampler_heads[i], test_sampler_tails[i]]))
+            procs.append(proc)
+            proc.start()
+        for proc in procs:
+            proc.join()
+    else:
+        test(args, model, [test_sampler_head, test_sampler_tail])
+
+
+if __name__ == '__main__':
+    args = ArgParser().parse_args()
+    main(args)
+
diff --git a/sagemaker-python-sdk/dgl_kge/kge_mxnet.ipynb b/sagemaker-python-sdk/dgl_kge/kge_mxnet.ipynb
new file mode 100644
index 0000000000..17596365ff --- /dev/null +++ b/sagemaker-python-sdk/dgl_kge/kge_mxnet.ipynb @@ -0,0 +1,148 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Training knowledge graph embedding by using the Deep Graph Library with MXNet backend\n", + "The **Amazon SageMaker Python SDK** makes it easy to train Deep Graph Library (DGL) models. In this example, you generate knowledge graph embedding using the [DMLC DGL API](https://github.com/dmlc/dgl.git) and FB15k dataset.\n", + "\n", + "For more information about knowledge graph embedding and this example, see https://github.com/dmlc/dgl/tree/master/apps/kg\n" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Setup\n", + "Define a few variables that are needed later in the example." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "import sagemaker\n", + "from sagemaker import get_execution_role\n", + "from sagemaker.session import Session\n", + "\n", + "# Setup session\n", + "sess = sagemaker.Session()\n", + "\n", + "# S3 bucket for saving code and model artifacts.\n", + "# Feel free to specify a different bucket here if you wish.\n", + "bucket = sess.default_bucket()\n", + "\n", + "# Location to put your custom code.\n", + "custom_code_upload_location = 'customcode'\n", + "\n", + "# IAM execution role that gives Amazon SageMaker access to resources in your AWS account.\n", + "# You can use the Amazon SageMaker Python SDK to get the role from the notebook environment. \n", + "role = get_execution_role()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Amazon SageMaker estimator class\n", + "The Amazon SageMaker estimator allows you to run a single machine in Amazon SageMaker, using CPU or GPU-based instances.\n", + "\n", + "When you create the estimator, pass in the file name of the training script and the name of the IAM execution role. Also provide a few other parameters. train_instance_count and train_instance_type determine the number and type of Amazon SageMaker instances that are used for the training job. The hyperparameters parameter is a dictionary of values that is passed to your training script as parameters that you can use argparse to parse.\n", + "\n", + "Here, use the official Docker image for this example. For more information, see https://docs.aws.amazon.com/dlami/latest/devguide/deep-learning-containers-images.html. You should get the latest mxnet-1.6.0-gpu-py3 image from official Amazon Elastic Container Registry (Amazon ECR) and push it into your own ECR." 
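Note that the hyperparameters dictionary is delivered to the entry point as command-line arguments. A minimal, hypothetical sketch of how train.py might read them (the real train.py in this directory defines many more options):

```python
import argparse

# Hypothetical excerpt: names mirror the params dict passed to the estimator below.
parser = argparse.ArgumentParser()
parser.add_argument('--dataset', type=str, default='FB15k')
parser.add_argument('--batch_size', type=int, default=1024)
parser.add_argument('--neg_sample_size', type=int, default=256)
parser.add_argument('--hidden_dim', type=int, default=2000)
args, _ = parser.parse_known_args()
```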
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "from sagemaker.mxnet.estimator import MXNet\n", + "\n", + "ENTRY_POINT = 'train.py'\n", + "CODE_PATH = './'\n", + "\n", + "account = sess.boto_session.client('sts').get_caller_identity()['Account']\n", + "region = sess.boto_session.region_name\n", + "\n", + "docker_name = 'beta-mxnet-training' # change this for your own ECR image name\n", + "docker_tag = '1.6.0-py3-gpu-build' # change this for your own ECR image tag\n", + "image = '{}.dkr.ecr.{}.amazonaws.com/{}:{}'.format(account, region, docker_name, docker_tag)\n", + "print(image)\n", + "\n", + "params = {}\n", + "params['dataset'] = 'FB15k'\n", + "params['model'] = 'DistMult'\n", + "params['batch_size'] = 1024\n", + "params['neg_sample_size'] = 256\n", + "params['hidden_dim'] = 2000\n", + "params['gamma'] = 500.0\n", + "params['lr'] = 0.1\n", + "params['max_step'] = 100000\n", + "params['batch_size_eval'] = 16\n", + "params['valid'] = True\n", + "params['test'] = True\n", + "params['neg_adversarial_sampling'] = True\n", + "\n", + "estimator = MXNet(entry_point=ENTRY_POINT,\n", + " source_dir=CODE_PATH,\n", + " role=role, \n", + " train_instance_count=1, \n", + " train_instance_type='ml.p3.2xlarge',\n", + " image_name=image,\n", + " hyperparameters=params,\n", + " sagemaker_session=sess)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Running the Training Job\n", + "After you construct the Estimator object, you can fit it by using Amazon SageMaker. The dataset is automatically downloaded." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "estimator.fit()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Output\n", + "You can get the resulting embedding output from the Amazon SageMaker console by searching for the training task and looking for the address of 'S3 model artifact'" + ] + } + ], + "metadata": { + "kernelspec": { + "display_name": "conda_mxnet_p36", + "language": "python", + "name": "conda_mxnet_p36" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.6.5" + } + }, + "nbformat": 4, + "nbformat_minor": 2 +} diff --git a/sagemaker-python-sdk/dgl_kge/kge_mxnet_hypertune.ipynb b/sagemaker-python-sdk/dgl_kge/kge_mxnet_hypertune.ipynb new file mode 100644 index 0000000000..e45345eaa8 --- /dev/null +++ b/sagemaker-python-sdk/dgl_kge/kge_mxnet_hypertune.ipynb @@ -0,0 +1,267 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Hyperparameter tuning with Amazon SageMaker and Deep Graph Library with MXNet backend\n", + "_**Creating a Hyperparameter tuning job for a DGL network**_\n", + "___\n", + "___\n", + "\n", + "\n", + "## Contents\n", + "1. [Background](#Background) \n", + "2. [Setup](#Setup)\n", + "3. [Tune](#Train) \n", + "4. [Wrap-up](#Wrap-up) \n", + "\n", + "## Background\n", + "This example notebook shows how to generate knowledge graph embedding using the DMLC DGL API and FB15k dataset. It uses the Amazon SageMaker hyperparameter tuning to start multiple training jobs with different hyperparameter combinations. This helps you find the set with best model performance. 
This is an important step in the machine learning process as hyperparameter settings can have a large effect on model accuracy. In this example, you use the [Amazon SageMaker Python SDK](https://github.com/aws/sagemaker-python-sdk) to create a hyperparameter tuning job for an Amazon SageMaker estimator." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Setup\n", + "This notebook was created and tested on an ml.p3.2xlarge notebook instance.\n", + "\n", + "Prerequisites\n", + " * You can successfully run the kge_mxnet example (see kge_mxnet.ipynb).\n", + " * You have an S3 bucket and prefix that you want to use for training and model data. This should be within the same Region as the notebook instance, training, and hosting.\n", + " * You have the IAM role ARN used to give training and hosting access to your data. See the documentation for more details on creating these. If a role not associated with the current notebook instance, or more than one role, is required for training or hosting, replace sagemaker.get_execution_role() with the appropriate full IAM role ARN strings.\n" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "import sagemaker\n", + "\n", + "from sagemaker import get_execution_role\n", + "from sagemaker.session import Session\n", + "\n", + "# Setup session\n", + "sess = sagemaker.Session()\n", + "\n", + "# S3 bucket for saving code and model artifacts.\n", + "# Feel free to specify a different bucket here if you wish.\n", + "bucket = sess.default_bucket()\n", + "\n", + "# Location to put your custom code.\n", + "custom_code_upload_location = 'customcode'\n", + "\n", + "# IAM execution role that gives Amazon SageMaker access to resources in your AWS account.\n", + "# You can use the Amazon SageMaker Python SDK to get the role from the notebook environment. \n", + "role = sagemaker.get_execution_role()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Now we'll import the Python libraries we'll need." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "import boto3\n", + "from sagemaker.tuner import IntegerParameter, CategoricalParameter, ContinuousParameter, HyperparameterTuner" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Tune\n", + "Similar to training a single training job in Amazon SageMaker, you define the training estimator passing in the code scripts, IAM role, (per job) hardware configuration, and any hyperparameters you're not tuning." 
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "from sagemaker.mxnet.estimator import MXNet\n",
+ "\n",
+ "ENTRY_POINT = 'train.py'\n",
+ "CODE_PATH = './'\n",
+ "\n",
+ "account = sess.boto_session.client('sts').get_caller_identity()['Account']\n",
+ "region = sess.boto_session.region_name\n",
+ "\n",
+ "docker_name = 'beta-mxnet-training' # change this for your own ECR image name\n",
+ "docker_tag = '1.6.0-py3-gpu-build' # change this for your own ECR image tag\n",
+ "image = '{}.dkr.ecr.{}.amazonaws.com/{}:{}'.format(account, region, docker_name, docker_tag)\n",
+ "print(image)\n",
+ "\n",
+ "params = {}\n",
+ "params['dataset'] = 'FB15k'\n",
+ "params['model'] = 'DistMult'\n",
+ "params['batch_size'] = 1024\n",
+ "params['neg_sample_size'] = 256\n",
+ "params['hidden_dim'] = 2000\n",
+ "params['max_step'] = 100000\n",
+ "params['batch_size_eval'] = 16\n",
+ "params['valid'] = True\n",
+ "params['test'] = True\n",
+ "params['neg_adversarial_sampling'] = True\n",
+ "\n",
+ "estimator = MXNet(entry_point=ENTRY_POINT,\n",
+ "                  source_dir=CODE_PATH,\n",
+ "                  role=role, \n",
+ "                  train_instance_count=1, \n",
+ "                  train_instance_type='ml.p3.2xlarge',\n",
+ "                  image_name=image,\n",
+ "                  hyperparameters=params,\n",
+ "                  sagemaker_session=sess)"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "After you define your estimator, specify the hyperparameters you want to tune and their possible values. You have three different types of hyperparameters.\n",
+ " * Categorical parameters need to take one value from a discrete set. Define this by passing the list of possible values to CategoricalParameter(list)\n",
+ " * Continuous parameters can take any real number value between the minimum and maximum value, defined by ContinuousParameter(min, max)\n",
+ " * Integer parameters can take any integer value between the minimum and maximum value, defined by IntegerParameter(min, max)\n",
+ " \n",
+ "If possible, it's almost always best to specify a value as the least restrictive type. For example, tuning threshold as a continuous value between 0.01 and 0.2 is likely to yield a better result than tuning as a categorical parameter with possible values of 0.01, 0.1, 0.15, or 0.2."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "hyperparameter_ranges = {'lr': ContinuousParameter(0.01, 0.1),\n",
+ "                         'gamma': ContinuousParameter(400, 600)}"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "Next, specify the objective metric that you want to tune and its definition. This includes the regular expression needed to extract that metric from the Amazon CloudWatch logs of the training job.\n",
+ "\n",
+ "You can capture evaluation results such as MR, MRR, and Hit10."
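The regular expressions in the next cell assume the training script prints log lines of a fixed shape. A quick, self-contained way to sanity-check a pattern against a hypothetical log line before launching any jobs:

```python
import re

# Hypothetical log line of the shape the 'final_MR' Regex below expects.
line = 'Test average MR at [100000/100000]: 245.6'
match = re.search(r"Test average MR at \[\S*\]: (\S*)", line)
print(match.group(1))  # -> '245.6'
```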
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "metric = []\n",
+ "mr_metric = {'Name': 'final_MR', 'Regex':\"Test average MR at \\[\\S*\\]: (\\S*)\"}\n",
+ "mrr_metric = {'Name': 'final_MRR', 'Regex':\"Test average MRR at \\[\\S*\\]: (\\S*)\"}\n",
+ "hit10_metric = {'Name': 'final_Hit10', 'Regex':\"Test average HITS@10 at \\[\\S*\\]: (\\S*)\"}\n",
+ "metric.append(mr_metric)\n",
+ "metric.append(mrr_metric)\n",
+ "metric.append(hit10_metric)"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "Now, create a HyperparameterTuner object, to which you pass:\n",
+ "\n",
+ " * The training estimator you created above\n",
+ " * The hyperparameter ranges\n",
+ " * Objective metric name and definition\n",
+ " * Number of training jobs to run in total and how many training jobs should be run simultaneously. More parallel jobs will finish tuning sooner, but may sacrifice accuracy. We recommend that you set the parallel jobs value to less than 10 percent of the total number of training jobs. It's set higher in this example to keep the example short.\n",
+ " * Whether you should maximize or minimize the objective metric. You choose 'Minimize' in this example, which is what you want for the MR result."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "tuner = HyperparameterTuner(estimator,\n",
+ "                            objective_metric_name='final_MR',\n",
+ "                            objective_type='Minimize',\n",
+ "                            hyperparameter_ranges=hyperparameter_ranges,\n",
+ "                            metric_definitions=metric,\n",
+ "                            max_jobs=6,\n",
+ "                            max_parallel_jobs=2)"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "And finally, you can start the tuning job by calling .fit()."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "tuner.fit()"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "Run a quick check of the hyperparameter tuning job's status to make sure it started successfully and is InProgress."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "boto3.client('sagemaker').describe_hyper_parameter_tuning_job(\n",
+ "    HyperParameterTuningJobName=tuner.latest_tuning_job.job_name)['HyperParameterTuningJobStatus']"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "## Wrap-up\n",
+ "Now that we've started the hyperparameter tuning job, it will run in the background. You can close this notebook. When it's finished, you can go to the console to analyze the result.\n",
+ "\n",
+ "For more information about Amazon SageMaker's Hyperparameter Tuning, see the AWS documentation."
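If you'd rather stay in the notebook than open the console, the same results can be pulled programmatically. A minimal sketch, assuming the tuning job above has finished:

```python
from sagemaker import HyperparameterTuningJobAnalytics

# One row per training job: the hyperparameters tried plus the final objective value.
analytics = HyperparameterTuningJobAnalytics(tuner.latest_tuning_job.job_name)
df = analytics.dataframe()

# final_MR is minimized, so the best candidates sort first in ascending order.
print(df.sort_values('FinalObjectiveValue').head())
```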
+ ]
+ }
+ ],
+ "metadata": {
+ "kernelspec": {
+ "display_name": "conda_mxnet_p36",
+ "language": "python",
+ "name": "conda_mxnet_p36"
+ },
+ "language_info": {
+ "codemirror_mode": {
+ "name": "ipython",
+ "version": 3
+ },
+ "file_extension": ".py",
+ "mimetype": "text/x-python",
+ "name": "python",
+ "nbconvert_exporter": "python",
+ "pygments_lexer": "ipython3",
+ "version": "3.6.5"
+ }
+ },
+ "nbformat": 4,
+ "nbformat_minor": 2
+}
diff --git a/sagemaker-python-sdk/dgl_kge/kge_pytorch.ipynb b/sagemaker-python-sdk/dgl_kge/kge_pytorch.ipynb
new file mode 100644
index 0000000000..6acb78821c
--- /dev/null
+++ b/sagemaker-python-sdk/dgl_kge/kge_pytorch.ipynb
@@ -0,0 +1,148 @@
+{
+ "cells": [
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "## Training knowledge graph embedding by using the Deep Graph Library with PyTorch backend\n",
+ "The **Amazon SageMaker Python SDK** makes it easy to train Deep Graph Library (DGL) models. In this example, you generate knowledge graph embedding using the [DMLC DGL API](https://github.com/dmlc/dgl.git) and FB15k dataset.\n",
+ "\n",
+ "For more details about knowledge graph embedding and this example, see https://github.com/dmlc/dgl/tree/master/apps/kg\n"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "### Setup\n",
+ "Define a few variables that are needed later in the example."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "import sagemaker\n",
+ "from sagemaker import get_execution_role\n",
+ "from sagemaker.session import Session\n",
+ "\n",
+ "# Setup session\n",
+ "sess = sagemaker.Session()\n",
+ "\n",
+ "# S3 bucket for saving code and model artifacts.\n",
+ "# Feel free to specify a different bucket here if you wish.\n",
+ "bucket = sess.default_bucket()\n",
+ "\n",
+ "# Location to put your custom code.\n",
+ "custom_code_upload_location = 'customcode'\n",
+ "\n",
+ "# IAM execution role that gives Amazon SageMaker access to resources in your AWS account.\n",
+ "# You can use the Amazon SageMaker Python SDK to get the role from the notebook environment. \n",
+ "role = get_execution_role()"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "### The Amazon SageMaker estimator class\n",
+ "The Amazon SageMaker estimator allows you to run a single machine in Amazon SageMaker, using CPU or GPU-based instances.\n",
+ "\n",
+ "When you create the estimator, pass in the file name of the training script and the name of the IAM execution role. Also provide a few other parameters. train_instance_count and train_instance_type determine the number and type of Amazon SageMaker instances that are used for the training job. The hyperparameters parameter is a dictionary of values that is passed to your training script as parameters so that you can use argparse to parse them.\n",
+ "\n",
+ "Here, use the official Docker image for this example. For more information, see https://docs.aws.amazon.com/dlami/latest/devguide/deep-learning-containers-images.html. You should get the latest pytorch-1.3.1-gpu-py3 image from Amazon Elastic Container Registry (Amazon ECR) and push it into your own ECR."
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "from sagemaker.pytorch import PyTorch\n", + "\n", + "ENTRY_POINT = 'train.py'\n", + "CODE_PATH = './'\n", + "\n", + "account = sess.boto_session.client('sts').get_caller_identity()['Account']\n", + "region = sess.boto_session.region_name\n", + "\n", + "docker_name = 'beta-pytorch-training' # change this for your own ECR image name\n", + "docker_tag = '1.3.1-py3-gpu-with-horovod-build' # change this for your own ECR image tag\n", + "image = '{}.dkr.ecr.{}.amazonaws.com/{}:{}'.format(account, region, docker_name, docker_tag)\n", + "print(image)\n", + "\n", + "params = {}\n", + "params['dataset'] = 'FB15k'\n", + "params['model'] = 'DistMult'\n", + "params['batch_size'] = 1024\n", + "params['neg_sample_size'] = 256\n", + "params['hidden_dim'] = 2000\n", + "params['gamma'] = 500.0\n", + "params['lr'] = 0.1\n", + "params['max_step'] = 100000\n", + "params['batch_size_eval'] = 16\n", + "params['valid'] = True\n", + "params['test'] = True\n", + "params['neg_adversarial_sampling'] = True\n", + "\n", + "estimator = PyTorch(entry_point=ENTRY_POINT,\n", + " source_dir=CODE_PATH,\n", + " role=role, \n", + " train_instance_count=1, \n", + " train_instance_type='ml.p3.2xlarge',\n", + " image_name=image,\n", + " hyperparameters=params,\n", + " sagemaker_session=sess)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Running the Training Job\n", + "After you construct the Estimator object, fit it by using Amazon SageMaker. The dataset is automatically downloaded." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "estimator.fit()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Output\n", + "You can get the resulting embedding output from the Amazon SageMaker console by searching for the training task and looking for the address of 'S3 model artifact'" + ] + } + ], + "metadata": { + "kernelspec": { + "display_name": "conda_dgl_py36_mxnet1.5", + "language": "python", + "name": "conda_dgl_py36_mxnet1.5" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.6.5" + } + }, + "nbformat": 4, + "nbformat_minor": 2 +} diff --git a/sagemaker-python-sdk/dgl_kge/kge_pytorch_hypertune.ipynb b/sagemaker-python-sdk/dgl_kge/kge_pytorch_hypertune.ipynb new file mode 100644 index 0000000000..ea32dc0fc4 --- /dev/null +++ b/sagemaker-python-sdk/dgl_kge/kge_pytorch_hypertune.ipynb @@ -0,0 +1,273 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Hyperparameter tuning with Amazon SageMaker and Deep Graph Library with PyTorch backend\n", + "_**Creating a Hyperparameter tuning job for a DGL network**_\n", + "___\n", + "___\n", + "\n", + "\n", + "## Contents\n", + "1. [Background](#Background) \n", + "2. [Setup](#Setup) \n", + "3. [Tune](#Train) \n", + "4. [Wrap-up](#Wrap-up) \n", + "\n", + "## Background\n", + "This example notebook shows how to generate knowledge graph embedding using the DMLC DGL API and FB15k dataset. It uses the Amazon SageMaker hyperparameter tuning to start multiple training jobs with different hyperparameter combinations. This helps you find the set with best model performance. 
This is an important step in the machine learning process as hyperparameter settings can have a large effect on model accuracy. In this example, you use the [Amazon SageMaker Python SDK](https://github.com/aws/sagemaker-python-sdk) to create a hyperparameter tuning job for an Amazon SageMaker estimator." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Setup\n", + "This notebook was created and tested on an ml.p3.2xlarge notebook instance.\n", + "\n", + "Prerequisites\n", + " * You can successfully run the kge_pytorch example (see kge_pytorch.ipynb).\n", + " * You have an S3 bucket and prefix that you want to use for training and model data. This should be within the same Region as the notebook instance, training, and hosting.\n", + " * You have the IAM role ARN used to give training and hosting access to your data. See the documentation for more details on creating these. If a role not associated with the current notebook instance, or more than one role, is required for training or hosting, replace sagemaker.get_execution_role() with the appropriate full IAM role ARN strings.\n" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "import sagemaker\n", + "from sagemaker import get_execution_role\n", + "from sagemaker.session import Session\n", + "\n", + "# Setup session\n", + "sess = sagemaker.Session()\n", + "\n", + "# S3 bucket for saving code and model artifacts.\n", + "# Feel free to specify a different bucket here if you wish.\n", + "bucket = sess.default_bucket()\n", + "\n", + "# Location to put your custom code.\n", + "custom_code_upload_location = 'customcode'\n", + "\n", + "# IAM execution role that gives Amazon SageMaker access to resources in your AWS account.\n", + "# You can use the Amazon SageMaker Python SDK to get the role from the notebook environment. \n", + "role = get_execution_role()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Now we'll import the Python libraries we'll need." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "import boto3\n", + "from sagemaker.tuner import IntegerParameter, CategoricalParameter, ContinuousParameter, HyperparameterTuner" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Tune\n", + "Similar to training a single training job in Amazon SageMaker, you define the training estimator passing in the code scripts, IAM role, (per job) hardware configuration, and any hyperparameters you're not tuning." 
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "from sagemaker.pytorch import PyTorch\n",
+ "\n",
+ "ENTRY_POINT = 'train.py'\n",
+ "CODE_PATH = './'\n",
+ "\n",
+ "account = sess.boto_session.client('sts').get_caller_identity()['Account']\n",
+ "region = sess.boto_session.region_name\n",
+ "\n",
+ "docker_name = 'beta-pytorch-training' # change this for your own ECR image name\n",
+ "docker_tag = '1.3.1-py3-gpu-with-horovod-build' # change this for your own ECR image tag\n",
+ "image = '{}.dkr.ecr.{}.amazonaws.com/{}:{}'.format(account, region, docker_name, docker_tag)\n",
+ "print(image)\n",
+ "\n",
+ "params = {}\n",
+ "params['dataset'] = 'FB15k'\n",
+ "params['model'] = 'DistMult'\n",
+ "params['batch_size'] = 1024\n",
+ "params['neg_sample_size'] = 256\n",
+ "params['hidden_dim'] = 2000\n",
+ "params['max_step'] = 100000\n",
+ "params['batch_size_eval'] = 16\n",
+ "params['valid'] = True\n",
+ "params['test'] = True\n",
+ "params['neg_adversarial_sampling'] = True\n",
+ "\n",
+ "estimator = PyTorch(entry_point=ENTRY_POINT,\n",
+ "                    source_dir=CODE_PATH,\n",
+ "                    role=role, \n",
+ "                    train_instance_count=1, \n",
+ "                    train_instance_type='ml.p3.2xlarge',\n",
+ "                    image_name=image,\n",
+ "                    hyperparameters=params,\n",
+ "                    sagemaker_session=sess)"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "After you define your estimator, specify the hyperparameters you want to tune and their possible values. You have three different types of hyperparameters.\n",
+ " * Categorical parameters need to take one value from a discrete set. Define this by passing the list of possible values to CategoricalParameter(list)\n",
+ " * Continuous parameters can take any real number value between the minimum and maximum value, defined by ContinuousParameter(min, max)\n",
+ " * Integer parameters can take any integer value between the minimum and maximum value, defined by IntegerParameter(min, max)\n",
+ " \n",
+ "If possible, it's almost always best to specify a value as the least restrictive type. For example, tuning threshold as a continuous value between 0.01 and 0.2 is likely to yield a better result than tuning as a categorical parameter with possible values of 0.01, 0.1, 0.15, or 0.2."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "hyperparameter_ranges = {'lr': ContinuousParameter(0.01, 0.1),\n",
+ "                         'gamma': ContinuousParameter(400, 600)}"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "Next, specify the objective metric that you want to tune and its definition. This includes the regular expression needed to extract that metric from the Amazon CloudWatch logs of the training job.\n",
+ "\n",
+ "You can capture evaluation results such as MR, MRR, and Hit10."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "metric = []\n",
+ "mr_metric = {'Name': 'final_MR', 'Regex':\"Test average MR at \\[\\S*\\]: (\\S*)\"}\n",
+ "mrr_metric = {'Name': 'final_MRR', 'Regex':\"Test average MRR at \\[\\S*\\]: (\\S*)\"}\n",
+ "hit10_metric = {'Name': 'final_Hit10', 'Regex':\"Test average HITS@10 at \\[\\S*\\]: (\\S*)\"}\n",
+ "metric.append(mr_metric)\n",
+ "metric.append(mrr_metric)\n",
+ "metric.append(hit10_metric)"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "Now, create a HyperparameterTuner object, to which you pass:\n",
+ "\n",
+ " * The training estimator you created above\n",
+ " * The hyperparameter ranges\n",
+ " * Objective metric name and definition\n",
+ " * Number of training jobs to run in total and how many training jobs should be run simultaneously. More parallel jobs will finish tuning sooner, but may sacrifice accuracy. We recommend that you set the parallel jobs value to less than 10 percent of the total number of training jobs. It's set higher in this example to keep the example short.\n",
+ " * Whether you should maximize or minimize the objective metric. You choose 'Minimize' in this example, which is what you want for the MR result."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "tuner = HyperparameterTuner(estimator,\n",
+ "                            objective_metric_name='final_MR',\n",
+ "                            objective_type='Minimize',\n",
+ "                            hyperparameter_ranges=hyperparameter_ranges,\n",
+ "                            metric_definitions=metric,\n",
+ "                            max_jobs=6,\n",
+ "                            max_parallel_jobs=2)"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "And finally, you can start the tuning job by calling .fit()."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "tuner.fit()"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "Run a quick check of the hyperparameter tuning job's status to make sure it started successfully and is InProgress."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "boto3.client('sagemaker').describe_hyper_parameter_tuning_job(\n",
+ "    HyperParameterTuningJobName=tuner.latest_tuning_job.job_name)['HyperParameterTuningJobStatus']"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "## Wrap-up\n",
+ "Now that we've started the hyperparameter tuning job, it will run in the background. You can close this notebook. When it's finished, you can go to the console to analyze the result.\n",
+ "\n",
+ "For more information about Amazon SageMaker's Hyperparameter Tuning, see the AWS documentation."
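As with the MXNet variant, you can also query the winning candidate from the notebook once tuning completes. A small sketch using the same boto3 client as above, assuming at least one training job has finished:

```python
# Describe the completed tuning job and read out the best training job.
desc = boto3.client('sagemaker').describe_hyper_parameter_tuning_job(
    HyperParameterTuningJobName=tuner.latest_tuning_job.job_name)

best = desc['BestTrainingJob']
print(best['TrainingJobName'])
print(best['FinalHyperParameterTuningJobObjectiveMetric'])  # name and value of final_MR
print(best['TunedHyperParameters'])  # e.g. the chosen 'lr' and 'gamma'
```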
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [] + } + ], + "metadata": { + "kernelspec": { + "display_name": "conda_pytorch_p36", + "language": "python", + "name": "conda_pytorch_p36" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.6.5" + } + }, + "nbformat": 4, + "nbformat_minor": 2 +} diff --git a/sagemaker-python-sdk/dgl_kge/models/__init__.py b/sagemaker-python-sdk/dgl_kge/models/__init__.py new file mode 100644 index 0000000000..14c56f7c64 --- /dev/null +++ b/sagemaker-python-sdk/dgl_kge/models/__init__.py @@ -0,0 +1 @@ +from .general_models import KEModel diff --git a/sagemaker-python-sdk/dgl_kge/models/general_models.py b/sagemaker-python-sdk/dgl_kge/models/general_models.py new file mode 100644 index 0000000000..4b79621ca6 --- /dev/null +++ b/sagemaker-python-sdk/dgl_kge/models/general_models.py @@ -0,0 +1,213 @@ +import os +import numpy as np +import dgl.backend as F + +backend = os.environ.get('DGLBACKEND') +if backend.lower() == 'mxnet': + from .mxnet.tensor_models import logsigmoid + from .mxnet.tensor_models import get_device + from .mxnet.tensor_models import norm + from .mxnet.tensor_models import get_scalar + from .mxnet.tensor_models import reshape + from .mxnet.tensor_models import cuda + from .mxnet.tensor_models import ExternalEmbedding + from .mxnet.score_fun import * +else: + from .pytorch.tensor_models import logsigmoid + from .pytorch.tensor_models import get_device + from .pytorch.tensor_models import norm + from .pytorch.tensor_models import get_scalar + from .pytorch.tensor_models import reshape + from .pytorch.tensor_models import cuda + from .pytorch.tensor_models import ExternalEmbedding + from .pytorch.score_fun import * + +class KEModel(object): + def __init__(self, args, model_name, n_entities, n_relations, hidden_dim, gamma, + double_entity_emb=False, double_relation_emb=False): + super(KEModel, self).__init__() + self.args = args + self.n_entities = n_entities + self.model_name = model_name + self.hidden_dim = hidden_dim + self.eps = 2.0 + self.emb_init = (gamma + self.eps) / hidden_dim + + entity_dim = 2 * hidden_dim if double_entity_emb else hidden_dim + relation_dim = 2 * hidden_dim if double_relation_emb else hidden_dim + + device = get_device(args) + self.entity_emb = ExternalEmbedding(args, n_entities, entity_dim, + F.cpu() if args.mix_cpu_gpu else device) + # For RESCAL, relation_emb = relation_dim * entity_dim + if model_name == 'RESCAL': + rel_dim = relation_dim * entity_dim + else: + rel_dim = relation_dim + self.relation_emb = ExternalEmbedding(args, n_relations, rel_dim, device) + + if model_name == 'TransE': + self.score_func = TransEScore(gamma) + elif model_name == 'TransR': + projection_emb = ExternalEmbedding(args, n_relations, entity_dim * relation_dim, + F.cpu() if args.mix_cpu_gpu else device) + self.score_func = TransRScore(gamma, projection_emb, relation_dim, entity_dim) + elif model_name == 'DistMult': + self.score_func = DistMultScore() + elif model_name == 'ComplEx': + self.score_func = ComplExScore() + elif model_name == 'RESCAL': + self.score_func = RESCALScore(relation_dim, entity_dim) + elif model_name == 'RotatE': + self.score_func = RotatEScore(gamma, self.emb_init) + + self.head_neg_score = self.score_func.create_neg(True) + self.tail_neg_score = 
self.score_func.create_neg(False) + self.head_neg_prepare = self.score_func.create_neg_prepare(True) + self.tail_neg_prepare = self.score_func.create_neg_prepare(False) + + self.reset_parameters() + + def share_memory(self): + # TODO(zhengda) we should make it work for parameters in score func + self.entity_emb.share_memory() + self.relation_emb.share_memory() + + def save_emb(self, path, dataset): + self.entity_emb.save(path, dataset+'_'+self.model_name+'_entity') + self.relation_emb.save(path, dataset+'_'+self.model_name+'_relation') + self.score_func.save(path, dataset+'_'+self.model_name) + + def load_emb(self, path, dataset): + self.entity_emb.load(path, dataset+'_'+self.model_name+'_entity') + self.relation_emb.load(path, dataset+'_'+self.model_name+'_relation') + self.score_func.load(path, dataset+'_'+self.model_name) + + def reset_parameters(self): + self.entity_emb.init(self.emb_init) + self.relation_emb.init(self.emb_init) + self.score_func.reset_parameters() + + def predict_score(self, g): + self.score_func(g) + return g.edata['score'] + + def predict_neg_score(self, pos_g, neg_g, to_device=None, gpu_id=-1, trace=False): + num_chunks = neg_g.num_chunks + chunk_size = neg_g.chunk_size + neg_sample_size = neg_g.neg_sample_size + if neg_g.neg_head: + neg_head_ids = neg_g.ndata['id'][neg_g.head_nid] + neg_head = self.entity_emb(neg_head_ids, gpu_id, trace) + _, tail_ids = pos_g.all_edges(order='eid') + if to_device is not None and gpu_id >= 0: + tail_ids = to_device(tail_ids, gpu_id) + tail = pos_g.ndata['emb'][tail_ids] + rel = pos_g.edata['emb'] + + neg_head, tail = self.head_neg_prepare(pos_g.edata['id'], num_chunks, neg_head, tail, gpu_id, trace) + neg_score = self.head_neg_score(neg_head, rel, tail, + num_chunks, chunk_size, neg_sample_size) + else: + neg_tail_ids = neg_g.ndata['id'][neg_g.tail_nid] + neg_tail = self.entity_emb(neg_tail_ids, gpu_id, trace) + head_ids, _ = pos_g.all_edges(order='eid') + if to_device is not None and gpu_id >= 0: + head_ids = to_device(head_ids, gpu_id) + head = pos_g.ndata['emb'][head_ids] + rel = pos_g.edata['emb'] + + head, neg_tail = self.tail_neg_prepare(pos_g.edata['id'], num_chunks, head, neg_tail, gpu_id, trace) + neg_score = self.tail_neg_score(head, rel, neg_tail, + num_chunks, chunk_size, neg_sample_size) + + return neg_score + + def forward_test(self, pos_g, neg_g, logs, gpu_id=-1): + pos_g.ndata['emb'] = self.entity_emb(pos_g.ndata['id'], gpu_id, False) + pos_g.edata['emb'] = self.relation_emb(pos_g.edata['id'], gpu_id, False) + + self.score_func.prepare(pos_g, gpu_id, False) + + batch_size = pos_g.number_of_edges() + pos_scores = self.predict_score(pos_g) + pos_scores = reshape(logsigmoid(pos_scores), batch_size, -1) + + neg_scores = self.predict_neg_score(pos_g, neg_g, to_device=cuda, + gpu_id=gpu_id, trace=False) + neg_scores = reshape(logsigmoid(neg_scores), batch_size, -1) + + # We need to filter the positive edges in the negative graph. + filter_bias = reshape(neg_g.edata['bias'], batch_size, -1) + if self.args.gpu >= 0: + filter_bias = cuda(filter_bias, self.args.gpu) + neg_scores += filter_bias + # To compute the rank of a positive edge among all negative edges, + # we need to know how many negative edges have higher scores than + # the positive edge. 
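+        # A rank of 1 means the positive edge outscored every sampled negative;
+        # edges flagged as false negatives were penalized through the 'bias'
+        # term above, so they are effectively discounted here.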
+ rankings = F.sum(neg_scores > pos_scores, dim=1) + 1 + rankings = F.asnumpy(rankings) + for i in range(batch_size): + ranking = rankings[i] + logs.append({ + 'MRR': 1.0 / ranking, + 'MR': float(ranking), + 'HITS@1': 1.0 if ranking <= 1 else 0.0, + 'HITS@3': 1.0 if ranking <= 3 else 0.0, + 'HITS@10': 1.0 if ranking <= 10 else 0.0 + }) + + # @profile + def forward(self, pos_g, neg_g, gpu_id=-1): + pos_g.ndata['emb'] = self.entity_emb(pos_g.ndata['id'], gpu_id, True) + pos_g.edata['emb'] = self.relation_emb(pos_g.edata['id'], gpu_id, True) + + self.score_func.prepare(pos_g, gpu_id, True) + + pos_score = self.predict_score(pos_g) + pos_score = logsigmoid(pos_score) + if gpu_id >= 0: + neg_score = self.predict_neg_score(pos_g, neg_g, to_device=cuda, + gpu_id=gpu_id, trace=True) + else: + neg_score = self.predict_neg_score(pos_g, neg_g, trace=True) + + neg_score = reshape(neg_score, -1, neg_g.neg_sample_size) + # Adversarial sampling + if self.args.neg_adversarial_sampling: + neg_score = F.sum(F.softmax(neg_score * self.args.adversarial_temperature, dim=1).detach() + * logsigmoid(-neg_score), dim=1) + else: + neg_score = F.mean(logsigmoid(-neg_score), dim=1) + + # subsampling weight + # TODO: add subsampling to new sampler + if self.args.non_uni_weight: + subsampling_weight = pos_g.edata['weight'] + pos_score = (pos_score * subsampling_weight).sum() / subsampling_weight.sum() + neg_score = (neg_score * subsampling_weight).sum() / subsampling_weight.sum() + else: + pos_score = pos_score.mean() + neg_score = neg_score.mean() + + # compute loss + loss = -(pos_score + neg_score) / 2 + + log = {'pos_loss': - get_scalar(pos_score), + 'neg_loss': - get_scalar(neg_score), + 'loss': get_scalar(loss)} + + # regularization: TODO(zihao) + #TODO: only reg ent&rel embeddings. other params to be added. 
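+        # Optional Lp-norm penalty over the embeddings touched in this minibatch
+        # (curr_emb); the norm order and coefficient come from the CLI arguments.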
+ if self.args.regularization_coef > 0.0 and self.args.regularization_norm > 0: + coef, nm = self.args.regularization_coef, self.args.regularization_norm + reg = coef * (norm(self.entity_emb.curr_emb(), nm) + norm(self.relation_emb.curr_emb(), nm)) + log['regularization'] = get_scalar(reg) + loss = loss + reg + + return loss, log + + def update(self): + self.entity_emb.update() + self.relation_emb.update() + self.score_func.update() diff --git a/sagemaker-python-sdk/dgl_kge/models/mxnet/__init__.py b/sagemaker-python-sdk/dgl_kge/models/mxnet/__init__.py new file mode 100644 index 0000000000..e69de29bb2 diff --git a/sagemaker-python-sdk/dgl_kge/models/mxnet/score_fun.py b/sagemaker-python-sdk/dgl_kge/models/mxnet/score_fun.py new file mode 100644 index 0000000000..7037c6dcf9 --- /dev/null +++ b/sagemaker-python-sdk/dgl_kge/models/mxnet/score_fun.py @@ -0,0 +1,451 @@ +import numpy as np +import mxnet as mx +from mxnet import gluon +from mxnet.gluon import nn +from mxnet import ndarray as nd + +class TransEScore(nn.Block): + def __init__(self, gamma): + super(TransEScore, self).__init__() + self.gamma = gamma + + def edge_func(self, edges): + head = edges.src['emb'] + tail = edges.dst['emb'] + rel = edges.data['emb'] + score = head + rel - tail + return {'score': self.gamma - nd.norm(score, ord=1, axis=-1)} + + def prepare(self, g, gpu_id, trace=False): + pass + + def create_neg_prepare(self, neg_head): + def fn(rel_id, num_chunks, head, tail, gpu_id, trace=False): + return head, tail + return fn + + def update(self): + pass + + def reset_parameters(self): + pass + + def save(self, path, name): + pass + + def load(self, path, name): + pass + + def forward(self, g): + g.apply_edges(lambda edges: self.edge_func(edges)) + + def create_neg(self, neg_head): + gamma = self.gamma + if neg_head: + def fn(heads, relations, tails, num_chunks, chunk_size, neg_sample_size): + hidden_dim = heads.shape[1] + heads = heads.reshape(num_chunks, 1, neg_sample_size, hidden_dim) + tails = tails - relations + tails = tails.reshape(num_chunks,chunk_size, 1, hidden_dim) + return gamma - nd.norm(heads - tails, ord=1, axis=-1) + return fn + else: + def fn(heads, relations, tails, num_chunks, chunk_size, neg_sample_size): + hidden_dim = heads.shape[1] + heads = heads + relations + heads = heads.reshape(num_chunks, chunk_size, 1, hidden_dim) + tails = tails.reshape(num_chunks, 1, neg_sample_size, hidden_dim) + return gamma - nd.norm(heads - tails, ord=1, axis=-1) + return fn + +class TransRScore(nn.Block): + def __init__(self, gamma, projection_emb, relation_dim, entity_dim): + super(TransRScore, self).__init__() + self.gamma = gamma + self.projection_emb = projection_emb + self.relation_dim = relation_dim + self.entity_dim = entity_dim + + def edge_func(self, edges): + head = edges.data['head_emb'] + tail = edges.data['tail_emb'] + rel = edges.data['emb'] + score = head + rel - tail + return {'score': self.gamma - nd.norm(score, ord=1, axis=-1)} + + def prepare(self, g, gpu_id, trace=False): + head_ids, tail_ids = g.all_edges(order='eid') + projection = self.projection_emb(g.edata['id'], gpu_id, trace) + projection = projection.reshape(-1, self.entity_dim, self.relation_dim) + head_emb = g.ndata['emb'][head_ids.as_in_context(g.ndata['emb'].context)].expand_dims(axis=-2) + tail_emb = g.ndata['emb'][tail_ids.as_in_context(g.ndata['emb'].context)].expand_dims(axis=-2) + g.edata['head_emb'] = nd.batch_dot(head_emb, projection).squeeze() + g.edata['tail_emb'] = nd.batch_dot(tail_emb, projection).squeeze() + + def 
create_neg_prepare(self, neg_head): + if neg_head: + def fn(rel_id, num_chunks, head, tail, gpu_id, trace=False): + # pos node, project to its relation + projection = self.projection_emb(rel_id, gpu_id, trace) + projection = projection.reshape(-1, self.entity_dim, self.relation_dim) + tail = tail.reshape(-1, 1, self.entity_dim) + tail = nd.batch_dot(tail, projection) + tail = tail.reshape(num_chunks, -1, self.relation_dim) + + # neg node, each project to all relations + projection = projection.reshape(num_chunks, -1, self.entity_dim, self.relation_dim) + head = head.reshape(num_chunks, -1, 1, self.entity_dim) + num_rels = projection.shape[1] + num_nnodes = head.shape[1] + + heads = [] + for i in range(num_chunks): + head_negs = [] + for j in range(num_nnodes): + head_neg = head[i][j] + head_neg = head_neg.reshape(1, 1, self.entity_dim) + head_neg = nd.broadcast_axis(head_neg, axis=0, size=num_rels) + head_neg = nd.batch_dot(head_neg, projection[i]) + head_neg = head_neg.squeeze(axis=1) + head_negs.append(head_neg) + head_negs = nd.stack(*head_negs, axis=1) + heads.append(head_negs) + head = nd.stack(*heads) + return head, tail + return fn + else: + def fn(rel_id, num_chunks, head, tail, gpu_id, trace=False): + # pos node, project to its relation + projection = self.projection_emb(rel_id, gpu_id, trace) + projection = projection.reshape(-1, self.entity_dim, self.relation_dim) + head = head.reshape(-1, 1, self.entity_dim) + head = nd.batch_dot(head, projection).squeeze() + head = head.reshape(num_chunks, -1, self.relation_dim) + + projection = projection.reshape(num_chunks, -1, self.entity_dim, self.relation_dim) + tail = tail.reshape(num_chunks, -1, 1, self.entity_dim) + num_rels = projection.shape[1] + num_nnodes = tail.shape[1] + + tails = [] + for i in range(num_chunks): + tail_negs = [] + for j in range(num_nnodes): + tail_neg = tail[i][j] + tail_neg = tail_neg.reshape(1, 1, self.entity_dim) + tail_neg = nd.broadcast_axis(tail_neg, axis=0, size=num_rels) + tail_neg = nd.batch_dot(tail_neg, projection[i]) + tail_neg = tail_neg.squeeze(axis=1) + tail_negs.append(tail_neg) + tail_negs = nd.stack(*tail_negs, axis=1) + tails.append(tail_negs) + tail = nd.stack(*tails) + return head, tail + return fn + + def forward(self, g): + g.apply_edges(lambda edges: self.edge_func(edges)) + + def reset_parameters(self): + self.projection_emb.init(1.0) + + def update(self): + self.projection_emb.update() + + def save(self, path, name): + self.projection_emb.save(path, name+'projection') + + def load(self, path, name): + self.projection_emb.load(path, name+'projection') + + def create_neg(self, neg_head): + gamma = self.gamma + if neg_head: + def fn(heads, relations, tails, num_chunks, chunk_size, neg_sample_size): + relations = relations.reshape(num_chunks, -1, self.relation_dim) + tails = tails - relations + tails = tails.reshape(num_chunks, -1, 1, self.relation_dim) + score = heads - tails + return gamma - nd.norm(score, ord=1, axis=-1) + return fn + else: + def fn(heads, relations, tails, num_chunks, chunk_size, neg_sample_size): + relations = relations.reshape(num_chunks, -1, self.relation_dim) + heads = heads - relations + heads = heads.reshape(num_chunks, -1, 1, self.relation_dim) + score = heads - tails + return gamma - nd.norm(score, ord=1, axis=-1) + return fn + +class DistMultScore(nn.Block): + def __init__(self): + super(DistMultScore, self).__init__() + + def edge_func(self, edges): + head = edges.src['emb'] + tail = edges.dst['emb'] + rel = edges.data['emb'] + score = head * rel * tail + # 
TODO: check if there exists minus sign and if gamma should be used here(jin) + return {'score': nd.sum(score, axis=-1)} + + def prepare(self, g, gpu_id, trace=False): + pass + + def create_neg_prepare(self, neg_head): + def fn(rel_id, num_chunks, head, tail, gpu_id, trace=False): + return head, tail + return fn + + def update(self): + pass + + def reset_parameters(self): + pass + + def save(self, path, name): + pass + + def load(self, path, name): + pass + + def forward(self, g): + g.apply_edges(lambda edges: self.edge_func(edges)) + + def create_neg(self, neg_head): + if neg_head: + def fn(heads, relations, tails, num_chunks, chunk_size, neg_sample_size): + hidden_dim = heads.shape[1] + heads = heads.reshape(num_chunks, neg_sample_size, hidden_dim) + heads = nd.transpose(heads, axes=(0, 2, 1)) + tmp = (tails * relations).reshape(num_chunks, chunk_size, hidden_dim) + return nd.linalg_gemm2(tmp, heads) + return fn + else: + def fn(heads, relations, tails, num_chunks, chunk_size, neg_sample_size): + hidden_dim = heads.shape[1] + tails = tails.reshape(num_chunks, neg_sample_size, hidden_dim) + tails = nd.transpose(tails, axes=(0, 2, 1)) + tmp = (heads * relations).reshape(num_chunks, chunk_size, hidden_dim) + return nd.linalg_gemm2(tmp, tails) + return fn + +class ComplExScore(nn.Block): + def __init__(self): + super(ComplExScore, self).__init__() + + def edge_func(self, edges): + real_head, img_head = nd.split(edges.src['emb'], num_outputs=2, axis=-1) + real_tail, img_tail = nd.split(edges.dst['emb'], num_outputs=2, axis=-1) + real_rel, img_rel = nd.split(edges.data['emb'], num_outputs=2, axis=-1) + + score = real_head * real_tail * real_rel \ + + img_head * img_tail * real_rel \ + + real_head * img_tail * img_rel \ + - img_head * real_tail * img_rel + # TODO: check if there exists minus sign and if gamma should be used here(jin) + return {'score': nd.sum(score, -1)} + + def prepare(self, g, gpu_id, trace=False): + pass + + def create_neg_prepare(self, neg_head): + def fn(rel_id, num_chunks, head, tail, gpu_id, trace=False): + return head, tail + return fn + + def update(self): + pass + + def reset_parameters(self): + pass + + def save(self, path, name): + pass + + def load(self, path, name): + pass + + def forward(self, g): + g.apply_edges(lambda edges: self.edge_func(edges)) + + def create_neg(self, neg_head): + if neg_head: + def fn(heads, relations, tails, num_chunks, chunk_size, neg_sample_size): + hidden_dim = heads.shape[1] + emb_real, emb_img = nd.split(tails, num_outputs=2, axis=-1) + rel_real, rel_img = nd.split(relations, num_outputs=2, axis=-1) + real = emb_real * rel_real + emb_img * rel_img + img = -emb_real * rel_img + emb_img * rel_real + emb_complex = nd.concat(real, img, dim=-1) + tmp = emb_complex.reshape(num_chunks, chunk_size, hidden_dim) + heads = heads.reshape(num_chunks, neg_sample_size, hidden_dim) + heads = nd.transpose(heads, axes=(0, 2, 1)) + return nd.linalg_gemm2(tmp, heads) + return fn + else: + def fn(heads, relations, tails, num_chunks, chunk_size, neg_sample_size): + hidden_dim = heads.shape[1] + emb_real, emb_img = nd.split(heads, num_outputs=2, axis=-1) + rel_real, rel_img = nd.split(relations, num_outputs=2, axis=-1) + real = emb_real * rel_real - emb_img * rel_img + img = emb_real * rel_img + emb_img * rel_real + emb_complex = nd.concat(real, img, dim=-1) + tmp = emb_complex.reshape(num_chunks, chunk_size, hidden_dim) + + tails = tails.reshape(num_chunks, neg_sample_size, hidden_dim) + tails = nd.transpose(tails, axes=(0, 2, 1)) + return 
nd.linalg_gemm2(tmp, tails) + return fn + +class RESCALScore(nn.Block): + def __init__(self, relation_dim, entity_dim): + super(RESCALScore, self).__init__() + self.relation_dim = relation_dim + self.entity_dim = entity_dim + + def edge_func(self, edges): + head = edges.src['emb'] + tail = edges.dst['emb'].expand_dims(2) + rel = edges.data['emb'] + rel = rel.reshape(-1, self.relation_dim, self.entity_dim) + score = head * mx.nd.batch_dot(rel, tail).squeeze() + # TODO: check if use self.gamma + return {'score': mx.nd.sum(score, -1)} + # return {'score': self.gamma - th.norm(score, p=1, dim=-1)} + + def prepare(self, g, gpu_id, trace=False): + pass + + def create_neg_prepare(self, neg_head): + def fn(rel_id, num_chunks, head, tail, gpu_id, trace=False): + return head, tail + return fn + + def update(self): + pass + + def reset_parameters(self): + pass + + def save(self, path, name): + pass + + def load(self, path, name): + pass + + def forward(self, g): + g.apply_edges(lambda edges: self.edge_func(edges)) + + def create_neg(self, neg_head): + if neg_head: + def fn(heads, relations, tails, num_chunks, chunk_size, neg_sample_size): + hidden_dim = heads.shape[1] + heads = heads.reshape(num_chunks, neg_sample_size, hidden_dim) + heads = mx.nd.transpose(heads, axes=(0,2,1)) + tails = tails.expand_dims(2) + relations = relations.reshape(-1, self.relation_dim, self.entity_dim) + tmp = mx.nd.batch_dot(relations, tails).squeeze() + tmp = tmp.reshape(num_chunks, chunk_size, hidden_dim) + return nd.linalg_gemm2(tmp, heads) + return fn + else: + def fn(heads, relations, tails, num_chunks, chunk_size, neg_sample_size): + hidden_dim = heads.shape[1] + tails = tails.reshape(num_chunks, neg_sample_size, hidden_dim) + tails = mx.nd.transpose(tails, axes=(0,2,1)) + heads = heads.expand_dims(2) + relations = relations.reshape(-1, self.relation_dim, self.entity_dim) + tmp = mx.nd.batch_dot(relations, heads).squeeze() + tmp = tmp.reshape(num_chunks, chunk_size, hidden_dim) + return nd.linalg_gemm2(tmp, tails) + return fn + +class RotatEScore(nn.Block): + def __init__(self, gamma, emb_init, eps=1e-10): + super(RotatEScore, self).__init__() + self.gamma = gamma + self.emb_init = emb_init + self.eps = eps + + def edge_func(self, edges): + real_head, img_head = nd.split(edges.src['emb'], num_outputs=2, axis=-1) + real_tail, img_tail = nd.split(edges.dst['emb'], num_outputs=2, axis=-1) + + phase_rel = edges.data['emb'] / (self.emb_init / np.pi) + re_rel, im_rel = nd.cos(phase_rel), nd.sin(phase_rel) + real_score = real_head * re_rel - img_head * im_rel + img_score = real_head * im_rel + img_head * re_rel + real_score = real_score - real_tail + img_score = img_score - img_tail + #sqrt((x*x).sum() + eps) + score = mx.nd.sqrt(real_score * real_score + img_score * img_score + self.eps).sum(-1) + return {'score': self.gamma - score} + + def prepare(self, g, gpu_id, trace=False): + pass + + def create_neg_prepare(self, neg_head): + def fn(rel_id, num_chunks, head, tail, gpu_id, trace=False): + return head, tail + return fn + + def update(self): + pass + + def reset_parameters(self): + pass + + def save(self, path, name): + pass + + def load(self, path, name): + pass + + def forward(self, g): + g.apply_edges(lambda edges: self.edge_func(edges)) + + def create_neg(self, neg_head): + gamma = self.gamma + emb_init = self.emb_init + eps = self.eps + if neg_head: + def fn(heads, relations, tails, num_chunks, chunk_size, neg_sample_size): + hidden_dim = heads.shape[1] + emb_real, emb_img = nd.split(tails, num_outputs=2, axis=-1) 
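+ # RotatE stores each relation embedding as a phase angle; dividing by + # (emb_init / pi) below maps the stored values into radians before taking + # cos/sin. Multiplying the tail by the conjugate of the relation rotation + # (the real/img parts computed next) undoes the relation's rotation so the + # result can be compared against every candidate head.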
+ phase_rel = relations / (emb_init / np.pi) + + rel_real, rel_img = nd.cos(phase_rel), nd.sin(phase_rel) + real = emb_real * rel_real + emb_img * rel_img + img = -emb_real * rel_img + emb_img * rel_real + emb_complex = nd.concat(real, img, dim=-1) + tmp = emb_complex.reshape(num_chunks, chunk_size, 1, hidden_dim) + heads = heads.reshape(num_chunks, 1, neg_sample_size, hidden_dim) + + score = tmp - heads + score_real, score_img = nd.split(score, num_outputs=2, axis=-1) + score = mx.nd.sqrt(score_real * score_real + score_img * score_img + self.eps).sum(-1) + + return gamma - score + return fn + else: + def fn(heads, relations, tails, num_chunks, chunk_size, neg_sample_size): + hidden_dim = heads.shape[1] + emb_real, emb_img = nd.split(heads, num_outputs=2, axis=-1) + phase_rel = relations / (emb_init / np.pi) + + rel_real, rel_img = nd.cos(phase_rel), nd.sin(phase_rel) + real = emb_real * rel_real - emb_img * rel_img + img = emb_real * rel_img + emb_img * rel_real + emb_complex = nd.concat(real, img, dim=-1) + tmp = emb_complex.reshape(num_chunks, chunk_size, 1, hidden_dim) + tails = tails.reshape(num_chunks, 1, neg_sample_size, hidden_dim) + + score = tmp - tails + score_real, score_img = nd.split(score, num_outputs=2, axis=-1) + score = mx.nd.sqrt(score_real * score_real + score_img * score_img + self.eps).sum(-1) + + return gamma - score + return fn + + diff --git a/sagemaker-python-sdk/dgl_kge/models/mxnet/tensor_models.py b/sagemaker-python-sdk/dgl_kge/models/mxnet/tensor_models.py new file mode 100644 index 0000000000..04e71e3877 --- /dev/null +++ b/sagemaker-python-sdk/dgl_kge/models/mxnet/tensor_models.py @@ -0,0 +1,94 @@ +import os +import numpy as np +import mxnet as mx +from mxnet import gluon +from mxnet import ndarray as nd + +from .score_fun import * +from .. 
import * + +def logsigmoid(val): + max_elem = nd.maximum(0., -val) + z = nd.exp(-max_elem) + nd.exp(-val - max_elem) + return -(max_elem + nd.log(z)) + +get_device = lambda args : mx.gpu(args.gpu) if args.gpu >= 0 else mx.cpu() +norm = lambda x, p: nd.sum(nd.abs(x) ** p) + +get_scalar = lambda x: x.detach().asscalar() + +reshape = lambda arr, x, y: arr.reshape(x, y) + +cuda = lambda arr, gpu: arr.as_in_context(mx.gpu(gpu)) + +class ExternalEmbedding: + def __init__(self, args, num, dim, ctx): + self.gpu = args.gpu + self.args = args + self.trace = [] + + self.emb = nd.empty((num, dim), dtype=np.float32, ctx=ctx) + self.state_sum = nd.zeros((self.emb.shape[0]), dtype=np.float32, ctx=ctx) + self.state_step = 0 + + def init(self, emb_init): + nd.random.uniform(-emb_init, emb_init, + shape=self.emb.shape, dtype=self.emb.dtype, + ctx=self.emb.context, out=self.emb) + + def share_memory(self): + # TODO(zhengda) fix this later + pass + + def __call__(self, idx, gpu_id=-1, trace=True): + if self.emb.context != idx.context: + idx = idx.as_in_context(self.emb.context) + data = nd.take(self.emb, idx) + if self.gpu >= 0: + data = data.as_in_context(mx.gpu(self.gpu)) + data.attach_grad() + if trace: + self.trace.append((idx, data)) + return data + + def update(self): + self.state_step += 1 + for idx, data in self.trace: + grad = data.grad + + clr = self.args.lr + #clr = self.args.lr / (1 + (self.state_step - 1) * group['lr_decay']) + + # the update is non-linear so indices must be unique + grad_indices = idx + grad_values = grad + + grad_sum = (grad_values * grad_values).mean(1) + ctx = self.state_sum.context + if ctx != grad_indices.context: + grad_indices = grad_indices.as_in_context(ctx) + if ctx != grad_sum.context: + grad_sum = grad_sum.as_in_context(ctx) + self.state_sum[grad_indices] += grad_sum + std = self.state_sum[grad_indices] # _sparse_mask + std_values = nd.expand_dims(nd.sqrt(std) + 1e-10, 1) + if self.gpu >= 0: + std_values = std_values.as_in_context(mx.gpu(self.args.gpu)) + tmp = (-clr * grad_values / std_values) + if tmp.context != ctx: + tmp = tmp.as_in_context(ctx) + # TODO(zhengda) the overhead is here. 
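+ # Sparse Adagrad-style step: only the embedding rows touched in this + # minibatch are rewritten, with the step size scaled by each row's + # accumulated squared-gradient history in state_sum.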
+ self.emb[grad_indices] = mx.nd.take(self.emb, grad_indices) + tmp + self.trace = [] + + def curr_emb(self): + data = [data for _, data in self.trace] + return nd.concat(*data, dim=0) + + def save(self, path, name): + emb_fname = os.path.join(path, name+'.npy') + np.save(emb_fname, self.emb.asnumpy()) + + def load(self, path, name): + emb_fname = os.path.join(path, name+'.npy') + self.emb = nd.array(np.load(emb_fname)) diff --git a/sagemaker-python-sdk/dgl_kge/models/pytorch/__init__.py b/sagemaker-python-sdk/dgl_kge/models/pytorch/__init__.py new file mode 100644 index 0000000000..e69de29bb2 diff --git a/sagemaker-python-sdk/dgl_kge/models/pytorch/score_fun.py b/sagemaker-python-sdk/dgl_kge/models/pytorch/score_fun.py new file mode 100644 index 0000000000..804a724908 --- /dev/null +++ b/sagemaker-python-sdk/dgl_kge/models/pytorch/score_fun.py @@ -0,0 +1,421 @@ +import torch as th +import torch.nn as nn +import torch.nn.functional as functional +import torch.nn.init as INIT +import numpy as np + +class TransEScore(nn.Module): + def __init__(self, gamma): + super(TransEScore, self).__init__() + self.gamma = gamma + + def edge_func(self, edges): + head = edges.src['emb'] + tail = edges.dst['emb'] + rel = edges.data['emb'] + score = head + rel - tail + return {'score': self.gamma - th.norm(score, p=1, dim=-1)} + + def prepare(self, g, gpu_id, trace=False): + pass + + def create_neg_prepare(self, neg_head): + def fn(rel_id, num_chunks, head, tail, gpu_id, trace=False): + return head, tail + return fn + + def forward(self, g): + g.apply_edges(lambda edges: self.edge_func(edges)) + + def update(self): + pass + + def reset_parameters(self): + pass + + def save(self, path, name): + pass + + def load(self, path, name): + pass + + def create_neg(self, neg_head): + gamma = self.gamma + if neg_head: + def fn(heads, relations, tails, num_chunks, chunk_size, neg_sample_size): + hidden_dim = heads.shape[1] + heads = heads.reshape(num_chunks, neg_sample_size, hidden_dim) + tails = tails - relations + tails = tails.reshape(num_chunks, chunk_size, hidden_dim) + return gamma - th.cdist(tails, heads, p=1) + return fn + else: + def fn(heads, relations, tails, num_chunks, chunk_size, neg_sample_size): + hidden_dim = heads.shape[1] + heads = heads + relations + heads = heads.reshape(num_chunks, chunk_size, hidden_dim) + tails = tails.reshape(num_chunks, neg_sample_size, hidden_dim) + return gamma - th.cdist(heads, tails, p=1) + return fn + +class TransRScore(nn.Module): + def __init__(self, gamma, projection_emb, relation_dim, entity_dim): + super(TransRScore, self).__init__() + self.gamma = gamma + self.projection_emb = projection_emb + self.relation_dim = relation_dim + self.entity_dim = entity_dim + + def edge_func(self, edges): + head = edges.data['head_emb'] + tail = edges.data['tail_emb'] + rel = edges.data['emb'] + score = head + rel - tail + return {'score': self.gamma - th.norm(score, p=1, dim=-1)} + + def prepare(self, g, gpu_id, trace=False): + head_ids, tail_ids = g.all_edges(order='eid') + projection = self.projection_emb(g.edata['id'], gpu_id, trace) + projection = projection.reshape(-1, self.entity_dim, self.relation_dim) + g.edata['head_emb'] = th.einsum('ab,abc->ac', g.ndata['emb'][head_ids], projection) + g.edata['tail_emb'] = th.einsum('ab,abc->ac', g.ndata['emb'][tail_ids], projection) + + def create_neg_prepare(self, neg_head): + if neg_head: + def fn(rel_id, num_chunks, head, tail, gpu_id, trace=False): + # pos node, project to its relation + projection = self.projection_emb(rel_id, 
gpu_id, trace) + projection = projection.reshape(num_chunks, -1, self.entity_dim, self.relation_dim) + tail = tail.reshape(num_chunks, -1, 1, self.entity_dim) + tail = th.matmul(tail, projection) + tail = tail.reshape(num_chunks, -1, self.relation_dim) + + # neg node, each project to all relations + head = head.reshape(num_chunks, 1, -1, self.entity_dim) + # (num_chunks, num_rel, num_neg_nodes, rel_dim) + head = th.matmul(head, projection) + return head, tail + return fn + else: + def fn(rel_id, num_chunks, head, tail, gpu_id, trace=False): + # pos node, project to its relation + projection = self.projection_emb(rel_id, gpu_id, trace) + projection = projection.reshape(num_chunks, -1, self.entity_dim, self.relation_dim) + head = head.reshape(num_chunks, -1, 1, self.entity_dim) + head = th.matmul(head, projection) + head = head.reshape(num_chunks, -1, self.relation_dim) + + # neg node, each project to all relations + tail = tail.reshape(num_chunks, 1, -1, self.entity_dim) + # (num_chunks, num_rel, num_neg_nodes, rel_dim) + tail = th.matmul(tail, projection) + return head, tail + return fn + + def forward(self, g): + g.apply_edges(lambda edges: self.edge_func(edges)) + + def reset_parameters(self): + self.projection_emb.init(1.0) + + def update(self): + self.projection_emb.update() + + def save(self, path, name): + self.projection_emb.save(path, name+'projection') + + def load(self, path, name): + self.projection_emb.load(path, name+'projection') + + def create_neg(self, neg_head): + gamma = self.gamma + if neg_head: + def fn(heads, relations, tails, num_chunks, chunk_size, neg_sample_size): + relations = relations.reshape(num_chunks, -1, self.relation_dim) + tails = tails - relations + tails = tails.reshape(num_chunks, -1, 1, self.relation_dim) + score = heads - tails + return gamma - th.norm(score, p=1, dim=-1) + return fn + else: + def fn(heads, relations, tails, num_chunks, chunk_size, neg_sample_size): + relations = relations.reshape(num_chunks, -1, self.relation_dim) + heads = heads - relations + heads = heads.reshape(num_chunks, -1, 1, self.relation_dim) + score = heads - tails + return gamma - th.norm(score, p=1, dim=-1) + return fn + +class DistMultScore(nn.Module): + def __init__(self): + super(DistMultScore, self).__init__() + + def edge_func(self, edges): + head = edges.src['emb'] + tail = edges.dst['emb'] + rel = edges.data['emb'] + score = head * rel * tail + # TODO: check if there exists minus sign and if gamma should be used here(jin) + return {'score': th.sum(score, dim=-1)} + + def prepare(self, g, gpu_id, trace=False): + pass + + def create_neg_prepare(self, neg_head): + def fn(rel_id, num_chunks, head, tail, gpu_id, trace=False): + return head, tail + return fn + + def update(self): + pass + + def reset_parameters(self): + pass + + def save(self, path, name): + pass + + def load(self, path, name): + pass + + def forward(self, g): + g.apply_edges(lambda edges: self.edge_func(edges)) + + def create_neg(self, neg_head): + if neg_head: + def fn(heads, relations, tails, num_chunks, chunk_size, neg_sample_size): + hidden_dim = heads.shape[1] + heads = heads.reshape(num_chunks, neg_sample_size, hidden_dim) + heads = th.transpose(heads, 1, 2) + tmp = (tails * relations).reshape(num_chunks, chunk_size, hidden_dim) + return th.bmm(tmp, heads) + return fn + else: + def fn(heads, relations, tails, num_chunks, chunk_size, neg_sample_size): + hidden_dim = tails.shape[1] + tails = tails.reshape(num_chunks, neg_sample_size, hidden_dim) + tails = th.transpose(tails, 1, 2) + tmp = (heads * 
relations).reshape(num_chunks, chunk_size, hidden_dim) + return th.bmm(tmp, tails) + return fn + +class ComplExScore(nn.Module): + def __init__(self): + super(ComplExScore, self).__init__() + + def edge_func(self, edges): + real_head, img_head = th.chunk(edges.src['emb'], 2, dim=-1) + real_tail, img_tail = th.chunk(edges.dst['emb'], 2, dim=-1) + real_rel, img_rel = th.chunk(edges.data['emb'], 2, dim=-1) + + score = real_head * real_tail * real_rel \ + + img_head * img_tail * real_rel \ + + real_head * img_tail * img_rel \ + - img_head * real_tail * img_rel + # TODO: check if there exists minus sign and if gamma should be used here(jin) + return {'score': th.sum(score, -1)} + + def prepare(self, g, gpu_id, trace=False): + pass + + def create_neg_prepare(self, neg_head): + def fn(rel_id, num_chunks, head, tail, gpu_id, trace=False): + return head, tail + return fn + + def update(self): + pass + + def reset_parameters(self): + pass + + def save(self, path, name): + pass + + def load(self, path, name): + pass + + def forward(self, g): + g.apply_edges(lambda edges: self.edge_func(edges)) + + def create_neg(self, neg_head): + if neg_head: + def fn(heads, relations, tails, num_chunks, chunk_size, neg_sample_size): + hidden_dim = heads.shape[1] + emb_real = tails[..., :hidden_dim // 2] + emb_imag = tails[..., hidden_dim // 2:] + rel_real = relations[..., :hidden_dim // 2] + rel_imag = relations[..., hidden_dim // 2:] + real = emb_real * rel_real + emb_imag * rel_imag + imag = -emb_real * rel_imag + emb_imag * rel_real + emb_complex = th.cat((real, imag), dim=-1) + tmp = emb_complex.reshape(num_chunks, chunk_size, hidden_dim) + heads = heads.reshape(num_chunks, neg_sample_size, hidden_dim) + heads = th.transpose(heads, 1, 2) + return th.bmm(tmp, heads) + return fn + else: + def fn(heads, relations, tails, num_chunks, chunk_size, neg_sample_size): + hidden_dim = heads.shape[1] + emb_real = heads[..., :hidden_dim // 2] + emb_imag = heads[..., hidden_dim // 2:] + rel_real = relations[..., :hidden_dim // 2] + rel_imag = relations[..., hidden_dim // 2:] + real = emb_real * rel_real - emb_imag * rel_imag + imag = emb_real * rel_imag + emb_imag * rel_real + emb_complex = th.cat((real, imag), dim=-1) + tmp = emb_complex.reshape(num_chunks, chunk_size, hidden_dim) + tails = tails.reshape(num_chunks, neg_sample_size, hidden_dim) + tails = th.transpose(tails, 1, 2) + return th.bmm(tmp, tails) + return fn + +class RESCALScore(nn.Module): + def __init__(self, relation_dim, entity_dim): + super(RESCALScore, self).__init__() + self.relation_dim = relation_dim + self.entity_dim = entity_dim + + def edge_func(self, edges): + head = edges.src['emb'] + tail = edges.dst['emb'].unsqueeze(-1) + rel = edges.data['emb'] + rel = rel.view(-1, self.relation_dim, self.entity_dim) + score = head * th.matmul(rel, tail).squeeze(-1) + # TODO: check if use self.gamma + return {'score': th.sum(score, dim=-1)} + # return {'score': self.gamma - th.norm(score, p=1, dim=-1)} + + def prepare(self, g, gpu_id, trace=False): + pass + + def create_neg_prepare(self, neg_head): + def fn(rel_id, num_chunks, head, tail, gpu_id, trace=False): + return head, tail + return fn + + def update(self): + pass + + def reset_parameters(self): + pass + + def save(self, path, name): + pass + + def load(self, path, name): + pass + + def forward(self, g): + g.apply_edges(lambda edges: self.edge_func(edges)) + + def create_neg(self, neg_head): + if neg_head: + def fn(heads, relations, tails, num_chunks, chunk_size, neg_sample_size): + hidden_dim = 
heads.shape[1] + heads = heads.reshape(num_chunks, neg_sample_size, hidden_dim) + heads = th.transpose(heads, 1, 2) + tails = tails.unsqueeze(-1) + relations = relations.view(-1, self.relation_dim, self.entity_dim) + tmp = th.matmul(relations, tails).squeeze(-1) + tmp = tmp.reshape(num_chunks, chunk_size, hidden_dim) + return th.bmm(tmp, heads) + return fn + else: + def fn(heads, relations, tails, num_chunks, chunk_size, neg_sample_size): + hidden_dim = heads.shape[1] + tails = tails.reshape(num_chunks, neg_sample_size, hidden_dim) + tails = th.transpose(tails, 1, 2) + heads = heads.unsqueeze(-1) + relations = relations.view(-1, self.relation_dim, self.entity_dim) + tmp = th.matmul(relations, heads).squeeze(-1) + tmp = tmp.reshape(num_chunks, chunk_size, hidden_dim) + return th.bmm(tmp, tails) + return fn + +class RotatEScore(nn.Module): + def __init__(self, gamma, emb_init): + super(RotatEScore, self).__init__() + self.gamma = gamma + self.emb_init = emb_init + + def edge_func(self, edges): + re_head, im_head = th.chunk(edges.src['emb'], 2, dim=-1) + re_tail, im_tail = th.chunk(edges.dst['emb'], 2, dim=-1) + + phase_rel = edges.data['emb'] / (self.emb_init / np.pi) + re_rel, im_rel = th.cos(phase_rel), th.sin(phase_rel) + re_score = re_head * re_rel - im_head * im_rel + im_score = re_head * im_rel + im_head * re_rel + re_score = re_score - re_tail + im_score = im_score - im_tail + score = th.stack([re_score, im_score], dim=0) + score = score.norm(dim=0) + return {'score': self.gamma - score.sum(-1)} + + def update(self): + pass + + def reset_parameters(self): + pass + + def save(self, path, name): + pass + + def load(self, path, name): + pass + + def forward(self, g): + g.apply_edges(lambda edges: self.edge_func(edges)) + + def create_neg_prepare(self, neg_head): + def fn(rel_id, num_chunks, head, tail, gpu_id, trace=False): + return head, tail + return fn + + def prepare(self, g, gpu_id, trace=False): + pass + + def create_neg(self, neg_head): + gamma = self.gamma + emb_init = self.emb_init + if neg_head: + def fn(heads, relations, tails, num_chunks, chunk_size, neg_sample_size): + hidden_dim = heads.shape[1] + emb_real = tails[..., :hidden_dim // 2] + emb_imag = tails[..., hidden_dim // 2:] + + phase_rel = relations / (emb_init / np.pi) + rel_real, rel_imag = th.cos(phase_rel), th.sin(phase_rel) + real = emb_real * rel_real + emb_imag * rel_imag + imag = -emb_real * rel_imag + emb_imag * rel_real + emb_complex = th.cat((real, imag), dim=-1) + tmp = emb_complex.reshape(num_chunks, chunk_size, 1, hidden_dim) + heads = heads.reshape(num_chunks, 1, neg_sample_size, hidden_dim) + score = tmp - heads + score = th.stack([score[..., :hidden_dim // 2], + score[..., hidden_dim // 2:]], dim=-1).norm(dim=-1) + return gamma - score.sum(-1) + + return fn + else: + def fn(heads, relations, tails, num_chunks, chunk_size, neg_sample_size): + hidden_dim = heads.shape[1] + emb_real = heads[..., :hidden_dim // 2] + emb_imag = heads[..., hidden_dim // 2:] + + phase_rel = relations / (emb_init / np.pi) + rel_real, rel_imag = th.cos(phase_rel), th.sin(phase_rel) + real = emb_real * rel_real - emb_imag * rel_imag + imag = emb_real * rel_imag + emb_imag * rel_real + + emb_complex = th.cat((real, imag), dim=-1) + tmp = emb_complex.reshape(num_chunks, chunk_size, 1, hidden_dim) + tails = tails.reshape(num_chunks, 1, neg_sample_size, hidden_dim) + score = tmp - tails + score = th.stack([score[..., :hidden_dim // 2], + score[..., hidden_dim // 2:]], dim=-1).norm(dim=-1) + + return gamma - score.sum(-1) + + return 
fn diff --git a/sagemaker-python-sdk/dgl_kge/models/pytorch/tensor_models.py b/sagemaker-python-sdk/dgl_kge/models/pytorch/tensor_models.py new file mode 100644 index 0000000000..ea2175dd20 --- /dev/null +++ b/sagemaker-python-sdk/dgl_kge/models/pytorch/tensor_models.py @@ -0,0 +1,104 @@ +""" +Knowledge Graph Embedding Models. +1. TransE +2. DistMult +3. ComplEx +4. RotatE +5. pRotatE +6. TransH +7. TransR +8. TransD +9. RESCAL +""" +import os +import numpy as np + +import torch as th +import torch.nn as nn +import torch.nn.functional as functional +import torch.nn.init as INIT + +from .. import * + +logsigmoid = functional.logsigmoid + +def get_device(args): + return th.device('cpu') if args.gpu < 0 else th.device('cuda:' + str(args.gpu)) + +norm = lambda x, p: x.norm(p=p)**p + +get_scalar = lambda x: x.detach().item() + +reshape = lambda arr, x, y: arr.view(x, y) + +cuda = lambda arr, gpu: arr.cuda(gpu) + +class ExternalEmbedding: + def __init__(self, args, num, dim, device): + self.gpu = args.gpu + self.args = args + self.trace = [] + + self.emb = th.empty(num, dim, dtype=th.float32, device=device) + self.state_sum = self.emb.new().resize_(self.emb.size(0)).zero_() + self.state_step = 0 + + def init(self, emb_init): + INIT.uniform_(self.emb, -emb_init, emb_init) + INIT.zeros_(self.state_sum) + + def share_memory(self): + self.emb.share_memory_() + self.state_sum.share_memory_() + + def __call__(self, idx, gpu_id=-1, trace=True): + s = self.emb[idx] + if self.gpu >= 0: + s = s.cuda(self.gpu) + data = s.clone().detach().requires_grad_(True) + if trace: + self.trace.append((idx, data)) + return data + + def update(self): + self.state_step += 1 + with th.no_grad(): + for idx, data in self.trace: + grad = data.grad.data + + clr = self.args.lr + #clr = self.args.lr / (1 + (self.state_step - 1) * group['lr_decay']) + + # the update is non-linear so indices must be unique + grad_indices = idx + grad_values = grad + + grad_sum = (grad_values * grad_values).mean(1) + device = self.state_sum.device + if device != grad_indices.device: + grad_indices = grad_indices.to(device) + if device != grad_sum.device: + grad_sum = grad_sum.to(device) + self.state_sum.index_add_(0, grad_indices, grad_sum) + std = self.state_sum[grad_indices] # _sparse_mask + std_values = std.sqrt_().add_(1e-10).unsqueeze(1) + if self.gpu >= 0: + std_values = std_values.cuda(self.args.gpu) + tmp = (-clr * grad_values / std_values) + if tmp.device != device: + tmp = tmp.to(device) + # TODO(zhengda) the overhead is here. 
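+ # index_add_ scatters the Adagrad step back into the embedding table in + # place, so only the rows indexed by this minibatch are modified.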
+ self.emb.index_add_(0, grad_indices, tmp) + self.trace = [] + + def curr_emb(self): + data = [data for _, data in self.trace] + return th.cat(data, 0) + + def save(self, path, name): + file_name = os.path.join(path, name+'.npy') + np.save(file_name, self.emb.cpu().detach().numpy()) + + def load(self, path, name): + file_name = os.path.join(path, name+'.npy') + self.emb = th.Tensor(np.load(file_name)) diff --git a/sagemaker-python-sdk/dgl_kge/train.py b/sagemaker-python-sdk/dgl_kge/train.py new file mode 100644 index 0000000000..7e0e29472b --- /dev/null +++ b/sagemaker-python-sdk/dgl_kge/train.py @@ -0,0 +1,294 @@ +from dataloader import EvalDataset, TrainDataset, NewBidirectionalOneShotIterator +from dataloader import get_dataset + +import argparse +import os +import logging +import time + +# Default to the PyTorch backend when DGLBACKEND is not set. +backend = os.environ.get('DGLBACKEND', 'pytorch') +if backend.lower() == 'mxnet': + import multiprocessing as mp + from train_mxnet import load_model + from train_mxnet import train + from train_mxnet import test +else: + import torch.multiprocessing as mp + from train_pytorch import load_model + from train_pytorch import train + from train_pytorch import test + +class ArgParser(argparse.ArgumentParser): + def __init__(self): + super(ArgParser, self).__init__() + + self.add_argument('--model_name', default='TransE', + choices=['TransE', 'TransH', 'TransR', 'TransD', + 'RESCAL', 'DistMult', 'ComplEx', 'RotatE', 'pRotatE'], + help='model to use') + self.add_argument('--data_path', type=str, default='data', + help='root path of all datasets') + self.add_argument('--dataset', type=str, default='FB15k', + help='dataset name, under data_path') + self.add_argument('--format', type=str, default='1', + help='the format of the dataset.') + self.add_argument('--save_path', type=str, default='ckpts', + help='place to save models and logs') + self.add_argument('--save_emb', type=str, default=None, + help='save the embeddings in the specified location.') + + self.add_argument('--max_step', type=int, default=80000, + help='number of training steps') + self.add_argument('--warm_up_step', type=int, default=None, + help='for learning rate decay') + self.add_argument('--batch_size', type=int, default=1024, + help='batch size') + self.add_argument('--batch_size_eval', type=int, default=8, + help='batch size used for eval and test') + self.add_argument('--neg_sample_size', type=int, default=128, + help='negative sampling size') + self.add_argument('--neg_sample_size_valid', type=int, default=1000, + help='negative sampling size for validation') + self.add_argument('--neg_sample_size_test', type=int, default=-1, + help='negative sampling size for testing') + self.add_argument('--hidden_dim', type=int, default=256, + help='hidden dim used by relation and entity') + self.add_argument('--lr', type=float, default=0.0001, + help='learning rate') + self.add_argument('-g', '--gamma', type=float, default=12.0, + help='margin value') + self.add_argument('--eval_percent', type=float, default=1, + help='sample some percentage for evaluation.') + + self.add_argument('--gpu', type=int, default=-1, + help='use GPU') + self.add_argument('--mix_cpu_gpu', action='store_true', + help='mix CPU and GPU training') + self.add_argument('-de', '--double_ent', action='store_true', + help='double entity dim for complex number') + self.add_argument('-dr', '--double_rel', action='store_true', + help='double relation dim for complex number') + self.add_argument('--seed', type=int, default=0, + help='set random seed for reproducibility') + self.add_argument('-log', 
'--log_interval', type=int, default=1000, + help='print training logs after every x steps') + self.add_argument('--eval_interval', type=int, default=10000, + help='do evaluation after every x steps') + self.add_argument('-adv', '--neg_adversarial_sampling', type=bool, default=True, + help='use negative adversarial sampling') + self.add_argument('-a', '--adversarial_temperature', default=1.0, type=float) + + self.add_argument('--valid', type=bool, default=True, + help='validate the model during training') + self.add_argument('--test', type=bool, default=True, + help='test the model after training') + self.add_argument('-rc', '--regularization_coef', type=float, default=0.000002, + help='set value > 0.0 if regularization is used') + self.add_argument('-rn', '--regularization_norm', type=int, default=3, + help='norm used in regularization') + self.add_argument('--num_worker', type=int, default=16, + help='number of workers used for loading data') + self.add_argument('--non_uni_weight', action='store_true', + help='use non-uniform edge weights when computing loss') + self.add_argument('--init_step', type=int, default=0, + help='DONT SET MANUALLY, used for resume') + self.add_argument('--step', type=int, default=0, + help='DONT SET MANUALLY, track current step') + self.add_argument('--pickle_graph', action='store_true', + help='pickle the built graph, building a huge graph is slow.') + self.add_argument('--num_proc', type=int, default=1, + help='number of processes used') + self.add_argument('--rel_part', action='store_true', + help='enable relation partitioning') + + +def get_logger(args): + if not os.path.exists(args.save_path): + os.mkdir(args.save_path) + + folder = '{}_{}_'.format(args.model_name, args.dataset) + n = len([x for x in os.listdir(args.save_path) if x.startswith(folder)]) + folder += str(n) + args.save_path = os.path.join(args.save_path, folder) + + if not os.path.exists(args.save_path): + os.makedirs(args.save_path) + log_file = os.path.join(args.save_path, 'train.log') + + logging.basicConfig( + format='%(asctime)s %(levelname)-8s %(message)s', + level=logging.INFO, + datefmt='%Y-%m-%d %H:%M:%S', + filename=log_file, + filemode='w' + ) + + logger = logging.getLogger(__name__) + print("Logs are being recorded at: {}".format(log_file)) + return logger + + +def run(args, logger): + # load dataset and samplers + dataset = get_dataset(args.data_path, args.dataset, args.format) + n_entities = dataset.n_entities + n_relations = dataset.n_relations + if args.neg_sample_size_test < 0: + args.neg_sample_size_test = n_entities + + train_data = TrainDataset(dataset, args, ranks=args.num_proc) + if args.num_proc > 1: + train_samplers = [] + for i in range(args.num_proc): + train_sampler_head = train_data.create_sampler(args.batch_size, args.neg_sample_size, + mode='PBG-head', + num_workers=args.num_worker, + shuffle=True, + exclude_positive=True, + rank=i) + train_sampler_tail = train_data.create_sampler(args.batch_size, args.neg_sample_size, + mode='PBG-tail', + num_workers=args.num_worker, + shuffle=True, + exclude_positive=True, + rank=i) + train_samplers.append(NewBidirectionalOneShotIterator(train_sampler_head, train_sampler_tail, + True, n_entities)) + else: + train_sampler_head = train_data.create_sampler(args.batch_size, args.neg_sample_size, + mode='PBG-head', + num_workers=args.num_worker, + shuffle=True, + exclude_positive=True) + train_sampler_tail = train_data.create_sampler(args.batch_size, args.neg_sample_size, + mode='PBG-tail', + num_workers=args.num_worker, + shuffle=True, + exclude_positive=True) + train_sampler = 
NewBidirectionalOneShotIterator(train_sampler_head, train_sampler_tail, + True, n_entities) + + if args.valid or args.test: + eval_dataset = EvalDataset(dataset, args) + if args.valid: + # Here we want to use the regular negative sampler because we need to ensure that + # all positive edges are excluded. + if args.num_proc > 1: + valid_sampler_heads = [] + valid_sampler_tails = [] + for i in range(args.num_proc): + valid_sampler_head = eval_dataset.create_sampler('valid', args.batch_size_eval, + args.neg_sample_size_valid, + mode='PBG-head', + num_workers=args.num_worker, + rank=i, ranks=args.num_proc) + valid_sampler_tail = eval_dataset.create_sampler('valid', args.batch_size_eval, + args.neg_sample_size_valid, + mode='PBG-tail', + num_workers=args.num_worker, + rank=i, ranks=args.num_proc) + valid_sampler_heads.append(valid_sampler_head) + valid_sampler_tails.append(valid_sampler_tail) + else: + valid_sampler_head = eval_dataset.create_sampler('valid', args.batch_size_eval, + args.neg_sample_size_valid, + mode='PBG-head', + num_workers=args.num_worker, + rank=0, ranks=1) + valid_sampler_tail = eval_dataset.create_sampler('valid', args.batch_size_eval, + args.neg_sample_size_valid, + mode='PBG-tail', + num_workers=args.num_worker, + rank=0, ranks=1) + if args.test: + # Here we want to use the regular negative sampler because we need to ensure that + # all positive edges are excluded. + if args.num_proc > 1: + test_sampler_tails = [] + test_sampler_heads = [] + for i in range(args.num_proc): + test_sampler_head = eval_dataset.create_sampler('test', args.batch_size_eval, + args.neg_sample_size_test, + mode='PBG-head', + num_workers=args.num_worker, + rank=i, ranks=args.num_proc) + test_sampler_tail = eval_dataset.create_sampler('test', args.batch_size_eval, + args.neg_sample_size_test, + mode='PBG-tail', + num_workers=args.num_worker, + rank=i, ranks=args.num_proc) + test_sampler_heads.append(test_sampler_head) + test_sampler_tails.append(test_sampler_tail) + else: + test_sampler_head = eval_dataset.create_sampler('test', args.batch_size_eval, + args.neg_sample_size_test, + mode='PBG-head', + num_workers=args.num_worker, + rank=0, ranks=1) + test_sampler_tail = eval_dataset.create_sampler('test', args.batch_size_eval, + args.neg_sample_size_test, + mode='PBG-tail', + num_workers=args.num_worker, + rank=0, ranks=1) + + # We need to free all memory referenced by dataset. 
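+ # Dropping these references lets Python garbage-collect the raw dataset + # before training begins, so the worker processes started below do not + # inherit an extra copy of the graph data.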
+ eval_dataset = None + dataset = None + # load model + model = load_model(logger, args, n_entities, n_relations) + + if args.num_proc > 1: + model.share_memory() + + # train + start = time.time() + if args.num_proc > 1: + procs = [] + for i in range(args.num_proc): + valid_samplers = [valid_sampler_heads[i], valid_sampler_tails[i]] if args.valid else None + proc = mp.Process(target=train, args=(args, model, train_samplers[i], valid_samplers)) + procs.append(proc) + proc.start() + for proc in procs: + proc.join() + else: + valid_samplers = [valid_sampler_head, valid_sampler_tail] if args.valid else None + train(args, model, train_sampler, valid_samplers) + print('training takes {} seconds'.format(time.time() - start)) + + if args.save_emb is not None: + if not os.path.exists(args.save_emb): + os.mkdir(args.save_emb) + model.save_emb(args.save_emb, args.dataset) + + # test + if args.test: + if args.num_proc > 1: + procs = [] + for i in range(args.num_proc): + proc = mp.Process(target=test, args=(args, model, [test_sampler_heads[i], test_sampler_tails[i]])) + procs.append(proc) + proc.start() + for proc in procs: + proc.join() + else: + test(args, model, [test_sampler_head, test_sampler_tail]) + +if __name__ == '__main__': + args = ArgParser().parse_args() + + # sagemaker related args + num_gpus = int(os.environ['SM_NUM_GPUS']) + if num_gpus == 0: + args.gpu = -1 + else: + # only use gpu0 now + args.gpu = 0 + + # specify model save location + args.save_path = str(os.environ['SM_MODEL_DIR']) + args.save_emb = os.path.join(args.save_path, 'emb') + print(args) + + logger = get_logger(args) + run(args, logger) diff --git a/sagemaker-python-sdk/dgl_kge/train_mxnet.py b/sagemaker-python-sdk/dgl_kge/train_mxnet.py new file mode 100644 index 0000000000..0d7b32f1f6 --- /dev/null +++ b/sagemaker-python-sdk/dgl_kge/train_mxnet.py @@ -0,0 +1,81 @@ +from models import KEModel + +import mxnet as mx +from mxnet import gluon +from mxnet import ndarray as nd + +import os +import logging +import time +import json + +def load_model(logger, args, n_entities, n_relations, ckpt=None): + model = KEModel(args, args.model_name, n_entities, n_relations, + args.hidden_dim, args.gamma, + double_entity_emb=args.double_ent, double_relation_emb=args.double_rel) + if ckpt is not None: + # TODO: loading model emb only works for general Embedding, not for ExternalEmbedding + if args.gpu >= 0: + model.load_parameters(ckpt, ctx=mx.gpu(args.gpu)) + else: + model.load_parameters(ckpt, ctx=mx.cpu()) + + logger.info('Load model {}'.format(args.model_name)) + return model + +def load_model_from_checkpoint(logger, args, n_entities, n_relations, ckpt_path): + model = load_model(logger, args, n_entities, n_relations) + model.load_emb(ckpt_path, args.dataset) + return model + +def train(args, model, train_sampler, valid_samplers=None): + if args.num_proc > 1: + os.environ['OMP_NUM_THREADS'] = '1' + logs = [] + + for arg in vars(args): + logging.info('{:20}:{}'.format(arg, getattr(args, arg))) + + start = time.time() + for step in range(args.init_step, args.max_step): + pos_g, neg_g = next(train_sampler) + args.step = step + with mx.autograd.record(): + loss, log = model.forward(pos_g, neg_g, args.gpu) + loss.backward() + logs.append(log) + model.update() + + if step % args.log_interval == 0: + for k in logs[0].keys(): + v = sum(l[k] for l in logs) / len(logs) + print('[Train]({}/{}) average {}: {}'.format(step, args.max_step, k, v)) + logs = [] + print(time.time() - start) + start = time.time() + + if args.valid and step % 
args.eval_interval == 0 and step > 1 and valid_samplers is not None: + start = time.time() + test(args, model, valid_samplers, mode='Valid') + print('test:', time.time() - start) + # clear cache + logs = [] + +def test(args, model, test_samplers, mode='Test'): + logs = [] + + for sampler in test_samplers: + #print('Number of tests: {}'.format(len(sampler))) + count = 0 + for pos_g, neg_g in sampler: + model.forward_test(pos_g, neg_g, logs, args.gpu) + + metrics = {} + if len(logs) > 0: + for metric in logs[0].keys(): + metrics[metric] = sum([log[metric] for log in logs]) / len(logs) + + for k, v in metrics.items(): + print('{} average {} at [{}/{}]: {}'.format(mode, k, args.step, args.max_step, v)) + for i in range(len(test_samplers)): + test_samplers[i] = test_samplers[i].reset() diff --git a/sagemaker-python-sdk/dgl_kge/train_pytorch.py b/sagemaker-python-sdk/dgl_kge/train_pytorch.py new file mode 100644 index 0000000000..a1c0d257d9 --- /dev/null +++ b/sagemaker-python-sdk/dgl_kge/train_pytorch.py @@ -0,0 +1,101 @@ +from models import KEModel + +from torch.utils.data import DataLoader +import torch.optim as optim +import torch as th +import torch.multiprocessing as mp + +from distutils.version import LooseVersion +TH_VERSION = LooseVersion(th.__version__) +if TH_VERSION.version[0] == 1 and TH_VERSION.version[1] < 2: + raise Exception("DGL-ke requires PyTorch version >= 1.2") + +import os +import logging +import time + +def load_model(logger, args, n_entities, n_relations, ckpt=None): + model = KEModel(args, args.model_name, n_entities, n_relations, + args.hidden_dim, args.gamma, + double_entity_emb=args.double_ent, double_relation_emb=args.double_rel) + if ckpt is not None: + # TODO: loading model emb only works for general Embedding, not for ExternalEmbedding + model.load_state_dict(ckpt['model_state_dict']) + return model + + +def load_model_from_checkpoint(logger, args, n_entities, n_relations, ckpt_path): + model = load_model(logger, args, n_entities, n_relations) + model.load_emb(ckpt_path, args.dataset) + return model + +def train(args, model, train_sampler, valid_samplers=None): + if args.num_proc > 1: + th.set_num_threads(1) + logs = [] + for arg in vars(args): + logging.info('{:20}:{}'.format(arg, getattr(args, arg))) + + start = time.time() + update_time = 0 + forward_time = 0 + backward_time = 0 + for step in range(args.init_step, args.max_step): + pos_g, neg_g = next(train_sampler) + args.step = step + + start1 = time.time() + loss, log = model.forward(pos_g, neg_g) + forward_time += time.time() - start1 + + start1 = time.time() + loss.backward() + backward_time += time.time() - start1 + + start1 = time.time() + model.update() + update_time += time.time() - start1 + logs.append(log) + + if step % args.log_interval == 0: + for k in logs[0].keys(): + v = sum(l[k] for l in logs) / len(logs) + print('[Train]({}/{}) average {}: {}'.format(step, args.max_step, k, v)) + logs = [] + print('[Train] {} steps take {:.3f} seconds'.format(args.log_interval, + time.time() - start)) + print('forward: {:.3f}, backward: {:.3f}, update: {:.3f}'.format(forward_time, + backward_time, + update_time)) + update_time = 0 + forward_time = 0 + backward_time = 0 + start = time.time() + + if args.valid and step % args.eval_interval == 0 and step > 1 and valid_samplers is not None: + start = time.time() + test(args, model, valid_samplers, mode='Valid') + print('test:', time.time() - start) + +def test(args, model, test_samplers, mode='Test'): + if args.num_proc > 1: + th.set_num_threads(1) + start = 
time.time() + with th.no_grad(): + logs = [] + for sampler in test_samplers: + count = 0 + for pos_g, neg_g in sampler: + with th.no_grad(): + model.forward_test(pos_g, neg_g, logs, args.gpu) + + metrics = {} + if len(logs) > 0: + for metric in logs[0].keys(): + metrics[metric] = sum([log[metric] for log in logs]) / len(logs) + + for k, v in metrics.items(): + print('{} average {} at [{}/{}]: {}'.format(mode, k, args.step, args.max_step, v)) + print('test:', time.time() - start) + test_samplers[0] = test_samplers[0].reset() + test_samplers[1] = test_samplers[1].reset() diff --git a/sagemaker_model_monitor/enable_model_monitor/SageMaker-Enable-Model-Monitor.ipynb b/sagemaker_model_monitor/enable_model_monitor/SageMaker-Enable-Model-Monitor.ipynb new file mode 100644 index 0000000000..5d12cb78ae --- /dev/null +++ b/sagemaker_model_monitor/enable_model_monitor/SageMaker-Enable-Model-Monitor.ipynb @@ -0,0 +1,302 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Enable Amazon SageMaker Model Monitor\n", + "\n", + "Amazon SageMaker provides the ability to monitor machine learning models in production and detect deviations in data quality in comparison to a baseline dataset (e.g. training data set). This notebook walks you through enabling data capture and setting up continuous monitoring for an existing Endpoint.\n", + "\n", + "This Notebook helps with the following:\n", + "* Update your existing SageMaker Endpoint to enable Model Monitoring\n", + "* Analyze the training dataset to generate a baseline constraint\n", + "* Set up a MonitoringSchedule for monitoring deviations from the specified baseline\n", + "\n", + "---" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Step 1: Enable real-time inference data capture" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "To enable data capture for monitoring the model data quality, you specify the new capture option called `DataCaptureConfig`. You can capture the request payload, the response payload or both with this configuration. The capture config applies to all variants. Please provide the Endpoint name in the following cell:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "# Please fill in the following for enabling data capture\n", + "endpoint_name = 'FILL-IN-HERE-YOUR-ENDPOINT-NAME'\n", + "s3_capture_upload_path = 'FILL-IN-HERE-YOUR-S3-BUCKET-PREFIX-HERE' #example: s3://bucket-name/path/to/endpoint-data-capture/\n", + "\n", + "##### \n", + "## IMPORTANT\n", + "##\n", + "## Please make sure to add the \"s3:PutObject\" permission to the \"role\" you provided in the SageMaker Model \n", + "## behind this Endpoint. 
Otherwise, Endpoint data capture will not work.\n", + "## \n", + "##### " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "from sagemaker.model_monitor import DataCaptureConfig\n", + "from sagemaker import RealTimePredictor\n", + "from sagemaker import session\n", + "import boto3\n", + "sm_session = session.Session(boto3.Session())\n", + "\n", + "# Change parameters as you would like - adjust sampling percentage, \n", + "# choose to capture request or response or both.\n", + "# Learn more from our documentation\n", + "data_capture_config = DataCaptureConfig(\n", + " enable_capture = True,\n", + " sampling_percentage=50,\n", + " destination_s3_uri=s3_capture_upload_path,\n", + " kms_key_id=None,\n", + " capture_options=[\"REQUEST\", \"RESPONSE\"],\n", + " csv_content_types=[\"text/csv\"],\n", + " json_content_types=[\"application/json\"])\n", + "\n", + "# Now it is time to apply the new configuration and wait for it to be applied\n", + "predictor = RealTimePredictor(endpoint=endpoint_name)\n", + "predictor.update_data_capture_config(data_capture_config=data_capture_config)\n", + "sm_session.wait_for_endpoint(endpoint=endpoint_name)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Before you proceed:\n", + "Currently SageMaker supports monitoring Endpoints out of the box only for **tabular (csv, flat-json)** datasets. If your Endpoint uses a different data format, the following steps will NOT work for you.\n" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Step 2: Model Monitor - Baselining" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "In addition to collecting the data, SageMaker allows you to monitor and evaluate the data observed by the Endpoints. For this:\n", + "1. We need to create a baseline against which we compare the real-time traffic. \n", + "1. Once a baseline is ready, we can set up a schedule to continuously evaluate/compare against the baseline." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Constraint suggestion with baseline/training dataset" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The training dataset with which you trained the model is usually a good baseline dataset. Note that the training dataset's data schema and the inference dataset schema should exactly match (i.e. number and order of the features).\n", + "\n", + "Using our training dataset, we'll ask SageMaker to suggest a set of baseline constraints and generate descriptive statistics to explore the data." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "baseline_data_uri = 'FILL-ME-IN' ##'s3://bucketname/path/to/baseline/data' - Where your training data is\n", + "baseline_results_uri = 'FILL-ME-IN' ##'s3://bucketname/path/to/baseline/data' - Where the results are to be stored in\n", + "\n", + "print('Baseline data uri: {}'.format(baseline_data_uri))\n", + "print('Baseline results uri: {}'.format(baseline_results_uri))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Create a baselining job with the training dataset" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Now that we have the training data ready in S3, let's kick off a job to `suggest` constraints. 
`DefaultModelMonitor.suggest_baseline(..)` kicks off a `ProcessingJob` using a SageMaker-provided Model Monitor container to generate the constraints. Please edit the configurations to fit your needs." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "from sagemaker.model_monitor import DefaultModelMonitor\n", + "from sagemaker.model_monitor.dataset_format import DatasetFormat\n", + "from sagemaker import get_execution_role\n", + "\n", + "role = get_execution_role()\n", + "\n", + "my_default_monitor = DefaultModelMonitor(\n", + " role=role,\n", + " instance_count=1,\n", + " instance_type='ml.m5.xlarge',\n", + " volume_size_in_gb=20,\n", + " max_runtime_in_seconds=3600,\n", + ")\n", + "\n", + "my_default_monitor.suggest_baseline(\n", + " baseline_dataset=baseline_data_uri+'/training-dataset-with-header.csv',\n", + " dataset_format=DatasetFormat.csv(header=True),\n", + " output_s3_uri=baseline_results_uri,\n", + " wait=True\n", + ")" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Explore the generated constraints and statistics" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "import pandas as pd\n", + "\n", + "baseline_job = my_default_monitor.latest_baselining_job\n", + "schema_df = pd.io.json.json_normalize(baseline_job.baseline_statistics().body_dict[\"features\"])\n", + "schema_df.head(10)" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "constraints_df = pd.io.json.json_normalize(baseline_job.suggested_constraints().body_dict[\"features\"])\n", + "constraints_df.head(10)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Before proceeding to enable monitoring, you can choose to edit the constraint file as required to fine-tune the constraints." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Step 3: Enable continuous monitoring\n", + "\n", + "We have collected the data above; here we proceed to analyze and monitor the data with MonitoringSchedules." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Create a schedule" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We are ready to create a model monitoring schedule for the Endpoint created earlier with the baseline resources (constraints and statistics)." 
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "from sagemaker.model_monitor import CronExpressionGenerator\n", + "from time import gmtime, strftime\n", + "\n", + "mon_schedule_name = 'FILL-IN-HERE'\n", + "s3_report_path = 'FILL-IN-HERE'\n", + "my_default_monitor.create_monitoring_schedule(\n", + " monitor_schedule_name=mon_schedule_name,\n", + " endpoint_input=predictor.endpoint,\n", + " output_s3_uri=s3_report_path,\n", + " statistics=my_default_monitor.baseline_statistics(),\n", + " constraints=my_default_monitor.suggested_constraints(),\n", + " schedule_cron_expression=CronExpressionGenerator.daily(),\n", + " enable_cloudwatch_metrics=True,\n", + "\n", + ")" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "desc_schedule_result = my_default_monitor.describe_schedule()\n", + "print('Schedule status: {}'.format(desc_schedule_result['MonitoringScheduleStatus']))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### All set\n", + "Now that your monitoring schedule has been created, please return to Amazon SageMaker Studio to list the executions for this Schedule and observe the results going forward." + ] + } + ], + "metadata": { + "anaconda-cloud": {}, + "kernelspec": { + "display_name": "conda_python3", + "language": "python", + "name": "conda_python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.6.5" + }, + "notice": "Copyright 2019 Amazon.com, Inc. or its affiliates. All Rights Reserved. Licensed under the Apache License, Version 2.0 (the \"License\"). You may not use this file except in compliance with the License. A copy of the License is located at http://aws.amazon.com/apache2.0/ or in the \"license\" file accompanying this file. This file is distributed on an \"AS IS\" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License." + }, + "nbformat": 4, + "nbformat_minor": 4 +} diff --git a/sagemaker_model_monitor/introduction/SageMaker-ModelMonitoring.ipynb b/sagemaker_model_monitor/introduction/SageMaker-ModelMonitoring.ipynb new file mode 100644 index 0000000000..74d7e6d592 --- /dev/null +++ b/sagemaker_model_monitor/introduction/SageMaker-ModelMonitoring.ipynb @@ -0,0 +1,742 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Amazon SageMaker Model Monitor\n", + "This notebook shows how to:\n", + "* Host a machine learning model in Amazon SageMaker and capture inference requests, results, and metadata \n", + "* Analyze a training dataset to generate baseline constraints\n", + "* Monitor a live endpoint for violations against constraints\n", + "\n", + "---\n", + "## Background\n", + "\n", + "Amazon SageMaker provides every developer and data scientist with the ability to build, train, and deploy machine learning models quickly. Amazon SageMaker is a fully-managed service that encompasses the entire machine learning workflow. You can label and prepare your data, choose an algorithm, train a model, and then tune and optimize it for deployment. You can deploy your models to production with Amazon SageMaker to make predictions at 
You can deploy your models to production with Amazon SageMaker to make predictions and lower costs than was previously possible.\n", + "\n", + "In addition, Amazon SageMaker enables you to capture the input, output and metadata for invocations of the models that you deploy. It also enables you to analyze the data and monitor its quality. In this notebook, you learn how Amazon SageMaker enables these capabilities.\n", + "\n", + "---\n", + "## Setup\n", + "\n", + "To get started, make sure you have these prerequisites completed.\n", + "\n", + "* Specify an AWS Region to host your model.\n", + "* An IAM role ARN exists that is used to give Amazon SageMaker access to your data in Amazon Simple Storage Service (Amazon S3). See the documentation for how to fine tune the permissions needed. \n", + "* Create an S3 bucket used to store the data used to train your model, any additional model data, and the data captured from model invocations. For demonstration purposes, you are using the same bucket for these. In reality, you might want to separate them with different security policies." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "isConfigCell": true + }, + "outputs": [], + "source": [ + "%%time\n", + "\n", + "# Handful of configuration\n", + "\n", + "import os\n", + "import boto3\n", + "import re\n", + "import json\n", + "from sagemaker import get_execution_role, session\n", + "\n", + "region= boto3.Session().region_name\n", + "\n", + "role = get_execution_role()\n", + "print(\"RoleArn: {}\".format(role))\n", + "\n", + "# You can use a different bucket, but make sure the role you chose for this notebook\n", + "# has the s3:PutObject permissions. This is the bucket into which the data is captured\n", + "bucket = session.Session(boto3.Session()).default_bucket()\n", + "print(\"Demo Bucket: {}\".format(bucket))\n", + "prefix = 'sagemaker/DEMO-ModelMonitor'\n", + "\n", + "data_capture_prefix = '{}/datacapture'.format(prefix)\n", + "s3_capture_upload_path = 's3://{}/{}'.format(bucket, data_capture_prefix)\n", + "reports_prefix = '{}/reports'.format(prefix)\n", + "s3_report_path = 's3://{}/{}'.format(bucket,reports_prefix)\n", + "code_prefix = '{}/code'.format(prefix)\n", + "s3_code_preprocessor_uri = 's3://{}/{}/{}'.format(bucket,code_prefix, 'preprocessor.py')\n", + "s3_code_postprocessor_uri = 's3://{}/{}/{}'.format(bucket,code_prefix, 'postprocessor.py')\n", + "\n", + "print(\"Capture path: {}\".format(s3_capture_upload_path))\n", + "print(\"Report path: {}\".format(s3_report_path))\n", + "print(\"Preproc Code path: {}\".format(s3_code_preprocessor_uri))\n", + "print(\"Postproc Code path: {}\".format(s3_code_postprocessor_uri))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "You can quickly verify that the execution role for this notebook has the necessary permissions to proceed. Put a simple test object into the S3 bucket you specified above. If this command fails, update the role to have `s3:PutObject` permission on the bucket and try again." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "# Upload some test files\n", + "boto3.Session().resource('s3').Bucket(bucket).Object(\"test_upload/test.txt\").upload_file('test_data/upload-test-file.txt')\n", + "print(\"Success! 
You are all set to proceed.\")" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# PART A: Capturing real-time inference data from Amazon SageMaker endpoints\n", + "Create an endpoint to showcase the data capture capability in action.\n", + "\n", + "### Upload the pre-trained model to Amazon S3\n", + "This code uploads a pre-trained XGBoost model that is ready for you to deploy. This model was trained using the XGB Churn Prediction Notebook in SageMaker. You can also use your own pre-trained model in this step. If you already have a pretrained model in Amazon S3, you can add it instead by specifying the s3_key." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "model_file = open(\"model/xgb-churn-prediction-model.tar.gz\", 'rb')\n", + "s3_key = os.path.join(prefix, 'xgb-churn-prediction-model.tar.gz')\n", + "boto3.Session().resource('s3').Bucket(bucket).Object(s3_key).upload_fileobj(model_file)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Deploy the model to Amazon SageMaker\n", + "Start with deploying a pre-trained churn prediction model. Here, you create the model object with the image and model data." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "from time import gmtime, strftime\n", + "from sagemaker.model import Model\n", + "from sagemaker.amazon.amazon_estimator import get_image_uri\n", + "\n", + "model_name = \"DEMO-xgb-churn-pred-model-monitor-\" + strftime(\"%Y-%m-%d-%H-%M-%S\", gmtime())\n", + "model_url = 'https://{}.s3-{}.amazonaws.com/{}/xgb-churn-prediction-model.tar.gz'.format(bucket, region, prefix)\n", + "image_uri = get_image_uri(boto3.Session().region_name, 'xgboost', '0.90-1')\n", + "\n", + "model = Model(image=image_uri, model_data=model_url, role=role)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "To enable data capture for monitoring the model data quality, you specify the new capture option called `DataCaptureConfig`. You can capture the request payload, the response payload or both with this configuration. The capture config applies to all variants. Go ahead with the deployment." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "from sagemaker.model_monitor import DataCaptureConfig\n", + "\n", + "endpoint_name = 'DEMO-xgb-churn-pred-model-monitor-' + strftime(\"%Y-%m-%d-%H-%M-%S\", gmtime())\n", + "print(\"EndpointName={}\".format(endpoint_name))\n", + "\n", + "data_capture_config = DataCaptureConfig(\n", + " enable_capture=True,\n", + " sampling_percentage=100,\n", + " destination_s3_uri=s3_capture_upload_path)\n", + "\n", + "predictor = model.deploy(initial_instance_count=1,\n", + " instance_type='ml.m4.xlarge',\n", + " endpoint_name=endpoint_name,\n", + " data_capture_config=data_capture_config)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Invoke the deployed model\n", + "\n", + "You can now send data to this endpoint to get inferences in real time. Because you enabled the data capture in the previous steps, the request and response payload, along with some additional metadata, is saved in the Amazon Simple Storage Service (Amazon S3) location you have specified in the DataCaptureConfig." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "This step invokes the endpoint with included sample data for about 2 minutes. 
Data is captured based on the sampling percentage specified and the capture continues until the data capture option is turned off." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "from sagemaker.predictor import RealTimePredictor\n", + "import time\n", + "\n", + "predictor = RealTimePredictor(endpoint=endpoint_name, content_type='text/csv')\n", + "\n", + "# get a subset of test data for a quick test\n", + "!head -120 test_data/test-dataset-input-cols.csv > test_data/test_sample.csv\n", + "print(\"Sending test traffic to the endpoint {}. \\nPlease wait...\".format(endpoint_name))\n", + "\n", + "with open('test_data/test_sample.csv', 'r') as f:\n", + " for row in f:\n", + " payload = row.rstrip('\\n')\n", + " response = predictor.predict(data=payload)\n", + " time.sleep(0.5)\n", + " \n", + "print(\"Done!\") " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## View captured data\n", + "\n", + "Now list the data capture files stored in Amazon S3. You should expect to see different files from different time periods organized based on the hour in which the invocation occurred. The format of the Amazon S3 path is:\n", + "\n", + "`s3://{destination-bucket-prefix}/{endpoint-name}/{variant-name}/yyyy/mm/dd/hh/filename.jsonl`" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "scrolled": true + }, + "outputs": [], + "source": [ + "s3_client = boto3.Session().client('s3')\n", + "current_endpoint_capture_prefix = '{}/{}'.format(data_capture_prefix, endpoint_name)\n", + "result = s3_client.list_objects(Bucket=bucket, Prefix=current_endpoint_capture_prefix)\n", + "capture_files = [capture_file.get(\"Key\") for capture_file in result.get('Contents')]\n", + "print(\"Found Capture Files:\")\n", + "print(\"\\n \".join(capture_files))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Next, view the contents of a single capture file. Here you should see all the data captured in an Amazon SageMaker-specific JSON Lines formatted file. Take a quick peek at the first few lines in the captured file." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "def get_obj_body(obj_key):\n", + " return s3_client.get_object(Bucket=bucket, Key=obj_key).get('Body').read().decode(\"utf-8\")\n", + "\n", + "capture_file = get_obj_body(capture_files[-1])\n", + "print(capture_file[:2000])" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Finally, the contents of a single line are shown below as formatted JSON so that you can inspect them a little more easily." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "scrolled": true + }, + "outputs": [], + "source": [ + "import json\n", + "print(json.dumps(json.loads(capture_file.split('\\n')[0]), indent=2))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "As you can see, each inference request is captured in one line in the jsonl file. The line contains both the input and output merged together. In the example, you provided the ContentType as `text/csv`, which is reflected in the `observedContentType` value. Also, the `encoding` value exposes the encoding that was used to encode the input and output payloads in the capture format.\n", + "\n", + "To recap, you observed how you can enable capturing the input or output payloads to an endpoint with a new parameter. 
You have also observed what the captured format looks like in Amazon S3. Next, continue to explore how Amazon SageMaker helps with monitoring the data collected in Amazon S3." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# PART B: Model Monitor - Baselining and continuous monitoring" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "In addition to collecting the data, Amazon SageMaker provides the capability for you to monitor and evaluate the data observed by the endpoints. For this:\n", + "1. Create a baseline with which you compare the realtime traffic. \n", + "1. Once a baseline is ready, set up a schedule to continuously evaluate and compare against the baseline." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## 1. Constraint suggestion with baseline/training dataset" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The training dataset with which you trained the model is usually a good baseline dataset. Note that the training dataset schema and the inference dataset schema should exactly match (i.e., the number and order of the features).\n", + "\n", + "From the training dataset you can ask Amazon SageMaker to suggest a set of baseline `constraints` and generate descriptive `statistics` to explore the data. For this example, upload the training dataset that was used to train the pre-trained model included in this example. If you already have it in Amazon S3, you can directly point to it." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "# copy over the training dataset to Amazon S3 (if you already have it in Amazon S3, you could reuse it)\n", + "baseline_prefix = prefix + '/baselining'\n", + "baseline_data_prefix = baseline_prefix + '/data'\n", + "baseline_results_prefix = baseline_prefix + '/results'\n", + "\n", + "baseline_data_uri = 's3://{}/{}'.format(bucket, baseline_data_prefix)\n", + "baseline_results_uri = 's3://{}/{}'.format(bucket, baseline_results_prefix)\n", + "print('Baseline data uri: {}'.format(baseline_data_uri))\n", + "print('Baseline results uri: {}'.format(baseline_results_uri))\n" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "training_data_file = open(\"test_data/training-dataset-with-header.csv\", 'rb')\n", + "s3_key = os.path.join(baseline_prefix, 'data', 'training-dataset-with-header.csv')\n", + "boto3.Session().resource('s3').Bucket(bucket).Object(s3_key).upload_fileobj(training_data_file)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Create a baselining job with training dataset" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Now that you have the training data ready in Amazon S3, start a job to `suggest` constraints. `DefaultModelMonitor.suggest_baseline(..)` starts a `ProcessingJob` using an Amazon SageMaker-provided Model Monitor container to generate the constraints.\n", 
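+ "\n", + "When the job completes, the suggested artifacts land under `baseline_results_uri`; the default Model Monitor container writes them as `statistics.json` and `constraints.json` (treat those file names as an assumption if you swap in a custom container). As a minimal sketch, once the job finishes you could peek at the raw suggestions by reusing the `get_obj_body` helper defined earlier:\n", + "\n", + "```python\n", + "# Minimal sketch: read the start of the suggested constraints file straight from S3.\n", + "# The constraints.json name is the default container convention (assumption).\n", + "constraints_key = '{}/constraints.json'.format(baseline_results_prefix)\n", + "print(get_obj_body(constraints_key)[:500])\n", + "```"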
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "from sagemaker.model_monitor import DefaultModelMonitor\n", + "from sagemaker.model_monitor.dataset_format import DatasetFormat\n", + "\n", + "my_default_monitor = DefaultModelMonitor(\n", + " role=role,\n", + " instance_count=1,\n", + " instance_type='ml.m5.xlarge',\n", + " volume_size_in_gb=20,\n", + " max_runtime_in_seconds=3600,\n", + ")\n", + "\n", + "my_default_monitor.suggest_baseline(\n", + " baseline_dataset=baseline_data_uri+'/training-dataset-with-header.csv',\n", + " dataset_format=DatasetFormat.csv(header=True),\n", + " output_s3_uri=baseline_results_uri,\n", + " wait=True\n", + ")" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Explore the generated constraints and statistics" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "s3_client = boto3.Session().client('s3')\n", + "result = s3_client.list_objects(Bucket=bucket, Prefix=baseline_results_prefix)\n", + "report_files = [report_file.get(\"Key\") for report_file in result.get('Contents')]\n", + "print(\"Found Files:\")\n", + "print(\"\\n \".join(report_files))" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "import pandas as pd\n", + "\n", + "baseline_job = my_default_monitor.latest_baselining_job\n", + "schema_df = pd.io.json.json_normalize(baseline_job.baseline_statistics().body_dict[\"features\"])\n", + "schema_df.head(10)" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "constraints_df = pd.io.json.json_normalize(baseline_job.suggested_constraints().body_dict[\"features\"])\n", + "constraints_df.head(10)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## 2. Analyzing collected data for data quality issues\n", + "\n", + "When you have collected the data above, analyze and monitor the data with Monitoring Schedules." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Create a schedule" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "# First, copy over some test scripts to the S3 bucket so that they can be used for pre and post processing\n", + "boto3.Session().resource('s3').Bucket(bucket).Object(code_prefix+\"/preprocessor.py\").upload_file('preprocessor.py')\n", + "boto3.Session().resource('s3').Bucket(bucket).Object(code_prefix+\"/postprocessor.py\").upload_file('postprocessor.py')" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "You can create a model monitoring schedule for the endpoint created earlier. Use the baseline resources (constraints and statistics) to compare against the realtime traffic.\n", 
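+ "\n", + "In the next cell, the `record_preprocessor_script` argument is commented out and refers to a `pre_processor_script` variable that is never defined. If you want the `preprocessor.py` uploaded above to run on each captured record before analysis, a minimal sketch is the assignment below (pairing it with the uploaded S3 URI is an assumption based on the upload in the previous code cell):\n", + "\n", + "```python\n", + "# Sketch: point the placeholder at the preprocessor script uploaded to S3 above,\n", + "# then un-comment record_preprocessor_script=pre_processor_script in the next cell.\n", + "pre_processor_script = s3_code_preprocessor_uri\n", + "```"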
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "from sagemaker.model_monitor import CronExpressionGenerator\n", + "from time import gmtime, strftime\n", + "\n", + "mon_schedule_name = 'DEMO-xgb-churn-pred-model-monitor-schedule-' + strftime(\"%Y-%m-%d-%H-%M-%S\", gmtime())\n", + "my_default_monitor.create_monitoring_schedule(\n", + " monitor_schedule_name=mon_schedule_name,\n", + " endpoint_input=predictor.endpoint,\n", + " #record_preprocessor_script=pre_processor_script,\n", + " post_analytics_processor_script=s3_code_postprocessor_uri,\n", + " output_s3_uri=s3_report_path,\n", + " statistics=my_default_monitor.baseline_statistics(),\n", + " constraints=my_default_monitor.suggested_constraints(),\n", + " schedule_cron_expression=CronExpressionGenerator.hourly(),\n", + " enable_cloudwatch_metrics=True,\n", + "\n", + ")" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Start generating some artificial traffic\n", + "The cell below starts a thread to send some traffic to the endpoint. Note that you need to stop the kernel to terminate this thread. If there is no traffic, the monitoring jobs are marked as `Failed` since there is no data to process." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "from threading import Thread\n", + "from time import sleep\n", + "import time\n", + "\n", + "endpoint_name=predictor.endpoint\n", + "runtime_client = boto3.client('runtime.sagemaker')\n", + "\n", + "# (repeating the invocation code from above so that this section can be run independently)\n", + "def invoke_endpoint(ep_name, file_name, runtime_client):\n", + " with open(file_name, 'r') as f:\n", + " for row in f:\n", + " payload = row.rstrip('\\n')\n", + " response = runtime_client.invoke_endpoint(EndpointName=ep_name,\n", + " ContentType='text/csv', \n", + " Body=payload)\n", + " time.sleep(1)\n", + " \n", + "def invoke_endpoint_forever():\n", + " while True:\n", + " invoke_endpoint(endpoint_name, 'test_data/test-dataset-input-cols.csv', runtime_client)\n", + " \n", + "thread = Thread(target = invoke_endpoint_forever)\n", + "thread.start()\n", + "\n", + "# Note that you need to stop the kernel to stop the invocations" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Describe and inspect the schedule\n", + "Once you describe the schedule, observe that the MonitoringScheduleStatus changes to Scheduled." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "desc_schedule_result = my_default_monitor.describe_schedule()\n", + "print('Schedule status: {}'.format(desc_schedule_result['MonitoringScheduleStatus']))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### List executions\n", + "The schedule starts jobs at the previously specified intervals. Here, you list the latest five executions. Note that if you are kicking this off after creating the hourly schedule, you might find the executions empty. You might have to wait until you cross the hour boundary (in UTC) to see executions kick off. The code below has the logic for waiting.\n", + "\n", + "Note: Even for an hourly schedule, Amazon SageMaker has a buffer period of 20 minutes to schedule your execution. You might see your execution start anywhere from zero to ~20 minutes after the hour boundary. This is expected and done for load balancing in the backend.\n", 
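+ "\n", + "As a quick orientation, the sketch below computes how far away the next hour boundary is in UTC; add the zero-to-twenty-minute buffer on top of that to estimate when the first execution can start:\n", + "\n", + "```python\n", + "# Minimal sketch: minutes until the next hour boundary (UTC).\n", + "from datetime import datetime, timedelta, timezone\n", + "\n", + "now = datetime.now(timezone.utc)\n", + "next_hour = (now + timedelta(hours=1)).replace(minute=0, second=0, microsecond=0)\n", + "print(\"Minutes until the next hour boundary:\", (next_hour - now).seconds // 60)\n", + "```"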
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "mon_executions = my_default_monitor.list_executions()\n", + "print(\"We created an hourly schedule above and it will kick off executions ON the hour (plus the 0 - 20 min buffer).\\nWe will have to wait till we hit the hour...\")\n", + "\n", + "while len(mon_executions) == 0:\n", + " print(\"Waiting for the 1st execution to happen...\")\n", + " time.sleep(60)\n", + " mon_executions = my_default_monitor.list_executions() " + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Inspect a specific execution (latest execution)\n", + "In the next cell, you pick up the latest scheduled execution and wait for it to reach a terminal state. Here are the possible terminal states and what each of them means: \n", + "* Completed - This means the monitoring execution completed and no issues were found in the violations report.\n", + "* CompletedWithViolations - This means the execution completed, but constraint violations were detected.\n", + "* Failed - The monitoring execution failed, maybe due to client error (perhaps incorrect role permissions) or infrastructure issues. Further examination of FailureReason and ExitMessage is necessary to identify what exactly happened.\n", + "* Stopped - The job exceeded the maximum runtime or was manually stopped." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "latest_execution = mon_executions[-1] # latest execution's index is -1, second to last is -2 and so on...\n", + "time.sleep(60)\n", + "latest_execution.wait(logs=False)\n", + "\n", + "print(\"Latest execution status: {}\".format(latest_execution.describe()['ProcessingJobStatus']))\n", + "print(\"Latest execution result: {}\".format(latest_execution.describe()['ExitMessage']))\n", + "\n", + "latest_job = latest_execution.describe()\n", + "if (latest_job['ProcessingJobStatus'] != 'Completed'):\n", + " print(\"====STOP==== \\n No completed executions to inspect further. Please wait till an execution completes or investigate previously reported failures.\")" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "report_uri=latest_execution.output.destination\n", + "print('Report Uri: {}'.format(report_uri))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### List the generated reports" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "from urllib.parse import urlparse\n", + "s3uri = urlparse(report_uri)\n", + "report_bucket = s3uri.netloc\n", + "report_key = s3uri.path.lstrip('/')\n", + "print('Report bucket: {}'.format(report_bucket))\n", + "print('Report key: {}'.format(report_key))\n", + "\n", + "s3_client = boto3.Session().client('s3')\n", + "result = s3_client.list_objects(Bucket=report_bucket, Prefix=report_key)\n", + "report_files = [report_file.get(\"Key\") for report_file in result.get('Contents')]\n", + "print(\"Found Report Files:\")\n", + "print(\"\\n \".join(report_files))" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Violations report" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "If there are any violations compared to the baseline, they will be listed here.\n", 
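+ "\n", + "For orientation, each entry in the violations report carries a feature name, the type of constraint check that was violated, and a human-readable description. A single (hypothetical) entry might look like the sketch below; the field names mirror the report normalized in the next cell, and the values are purely illustrative:\n", + "\n", + "```python\n", + "# Hypothetical violation entry (illustrative values, not real output):\n", + "{\n", + "    \"feature_name\": \"feature2\",\n", + "    \"constraint_check_type\": \"baseline_drift_check\",\n", + "    \"description\": \"Baseline drift distance exceeds the configured threshold\"\n", + "}\n", + "```"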
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "violations = my_default_monitor.latest_monitoring_constraint_violations()\n", + "pd.set_option('display.max_colwidth', -1)\n", + "constraints_df = pd.io.json.json_normalize(violations.body_dict[\"violations\"])\n", + "constraints_df.head(10)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Other commands\n", + "We can also start and stop the monitoring schedules." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "#my_default_monitor.stop_monitoring_schedule()\n", + "#my_default_monitor.start_monitoring_schedule()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Delete the resources\n", + "\n", + "You can keep your endpoint running to continue capturing data. If you do not plan to collect more data or use this endpoint further, you should delete the endpoint to avoid incurring additional charges. Note that deleting your endpoint does not delete the data that was captured during the model invocations. That data persists in Amazon S3 until you delete it yourself.\n", + "\n", + "Before you delete the endpoint, however, you need to delete the monitoring schedule." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "my_default_monitor.delete_monitoring_schedule()\n", + "time.sleep(60) # actually wait for the deletion" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": { + "scrolled": true + }, + "outputs": [], + "source": [ + "predictor.delete_endpoint()" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "predictor.delete_model()" + ] + } + ], + "metadata": { + "anaconda-cloud": {}, + "kernelspec": { + "display_name": "conda_python3", + "language": "python", + "name": "conda_python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.6.5" + }, + "notice": "Copyright 2017 Amazon.com, Inc. or its affiliates. All Rights Reserved. Licensed under the Apache License, Version 2.0 (the \"License\"). You may not use this file except in compliance with the License. A copy of the License is located at http://aws.amazon.com/apache2.0/ or in the \"license\" file accompanying this file. This file is distributed on an \"AS IS\" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License." 
+ }, + "nbformat": 4, + "nbformat_minor": 4 +} diff --git a/sagemaker_model_monitor/introduction/model/xgb-churn-prediction-model.tar.gz b/sagemaker_model_monitor/introduction/model/xgb-churn-prediction-model.tar.gz new file mode 100644 index 0000000000..75e14e4190 Binary files /dev/null and b/sagemaker_model_monitor/introduction/model/xgb-churn-prediction-model.tar.gz differ diff --git a/sagemaker_model_monitor/introduction/postprocessor.py b/sagemaker_model_monitor/introduction/postprocessor.py new file mode 100644 index 0000000000..8c75cb5d3c --- /dev/null +++ b/sagemaker_model_monitor/introduction/postprocessor.py @@ -0,0 +1,2 @@ +def postprocess_handler(): + print("Hello from post-proc script!") \ No newline at end of file diff --git a/sagemaker_model_monitor/introduction/preprocessor.py b/sagemaker_model_monitor/introduction/preprocessor.py new file mode 100644 index 0000000000..1f14ef49f3 --- /dev/null +++ b/sagemaker_model_monitor/introduction/preprocessor.py @@ -0,0 +1,19 @@ +import json +import random + +# sample preprocess_handler (to be implemented by customer) +# This is a trivial example, where we simply generate random values +# But customers can read the data from inference_record and transform it into +# a flattened json structure +def preprocess_handler(inference_record): + event_data = inference_record.event_data + input_data = {} + output_data = {} + + input_data['feature0'] = random.randint(1, 3) + input_data['feature1'] = random.uniform(0, 1.6) + input_data['feature2'] = random.uniform(0, 1.6) + + output_data['prediction0'] = random.uniform(1, 30) + + return {**input_data, **output_data} \ No newline at end of file diff --git a/sagemaker_model_monitor/introduction/test_data/test-dataset-input-cols.csv b/sagemaker_model_monitor/introduction/test_data/test-dataset-input-cols.csv new file mode 100644 index 0000000000..c34e8a5d66 --- /dev/null +++ b/sagemaker_model_monitor/introduction/test_data/test-dataset-input-cols.csv @@ -0,0 +1,334 @@ +186,0.1,137.8,97,187.7,118,146.4,85,8.7,6,1,0.2,0.3,0.4,0.5,0.6,0.7,0.8,0.9,0.10,0.11,0.12,0.13,0.14,0.15,0.16,0.17,1.1,0.18,0.19,0.20,0.21,0.22,0.23,0.24,0.25,0.26,0.27,0.28,0.29,0.30,0.31,0.32,0.33,0.34,0.35,0.36,0.37,0.38,0.39,0.40,0.41,0.42,0.43,0.44,0.45,0.46,0.47,0.48,0.49,0.50,0.51,0.52,0.53,1.2,1.3,0.54,1.4,0.55 +132,25,113.2,96,269.9,107,229.1,87,7.1,7,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +112,17,183.2,95,252.8,125,156.7,95,9.7,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +91,24,93.5,112,183.4,128,240.7,133,9.9,3,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1 +22,0,110.3,107,166.5,93,202.3,96,9.5,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +102,0,186.8,92,173.7,123,250.9,131,9.7,4,2,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +118,21,156.5,122,209.2,125,158.7,81,11.1,3,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,1 +178,35,175.4,88,190.0,65,138.7,94,10.5,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 
+107,0,234.1,91,163.1,105,282.5,100,10.0,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,1,0 +94,0,207.0,109,167.4,80,238.2,117,2.6,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1,0,1,0 +86,33,253.1,112,210.1,94,95.0,98,11.9,4,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +19,0,186.1,98,254.3,57,214.0,127,14.6,7,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +72,0,179.9,113,149.8,112,168.2,79,9.8,7,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1,0,1,0 +199,34,230.6,121,219.4,99,299.3,94,8.0,2,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +146,23,149.6,96,239.8,124,293.5,135,7.4,4,2,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +121,34,245.0,95,216.9,66,112.4,125,7.5,8,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +85,0,102.0,95,270.2,139,148.2,105,10.7,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,1,0,1,0 +29,0,195.6,71,126.4,74,148.6,87,14.2,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +123,34,305.2,80,156.5,109,280.0,81,13.2,7,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +57,0,161.0,113,208.0,134,208.1,81,8.4,4,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +92,16,184.0,99,76.4,134,185.1,96,12.7,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,1,0,0,1 +140,0,120.3,108,240.4,84,216.4,74,7.7,3,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +77,0,142.3,112,306.3,111,196.5,82,9.9,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +126,0,122.4,88,143.8,111,157.0,106,11.5,3,1,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +108,15,165.1,85,267.0,93,250.7,114,10.9,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +70,0,222.8,114,215.9,113,223.5,122,0.0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +113,0,193.8,99,221.4,125,172.3,67,10.6,6,1,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +111,0,294.7,90,294.6,72,260.1,121,10.8,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +140,0,160.5,114,240.5,103,233.5,121,11.3,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +109,0,217.0,115,207.0,142,268.0,106,8.2,4,1,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 
+74,0,157.1,95,213.1,36,280.4,77,7.6,3,2,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +96,0,200.6,117,289.5,120,98.3,95,11.2,5,2,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +86,0,194.2,98,193.8,95,192.0,123,9.3,7,3,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +130,0,211.2,119,231.1,120,220.9,80,6.3,9,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,0,0,1,0,1,0 +63,0,117.1,118,249.6,90,162.2,84,11.1,4,3,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +79,0,268.3,114,185.5,111,264.6,88,6.3,7,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +123,0,128.7,126,117.6,94,198.4,132,10.8,5,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +102,0,123.1,106,182.0,102,244.6,75,12.6,7,2,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +27,0,177.6,121,296.8,92,192.9,106,7.6,3,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +137,0,127.0,107,323.2,75,143.9,127,7.5,2,1,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +67,31,175.2,68,199.2,73,219.8,99,13.2,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +130,0,252.0,101,170.2,105,209.2,64,5.7,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +105,0,212.0,113,226.6,128,193.6,114,8.9,7,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +67,30,129.6,107,233.0,104,297.0,93,14.5,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0,0,1 +195,0,63.2,108,220.2,88,184.0,99,5.1,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,0,1,0,1,0 +106,0,133.7,45,187.8,107,181.9,89,10.7,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,1,0 +76,0,90.5,142,211.7,75,194.9,76,9.3,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +115,26,155.2,110,230.9,133,261.6,100,4.5,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +95,0,156.6,88,247.6,75,192.3,115,12.3,5,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +84,12,89.7,87,138.6,73,165.8,114,10.7,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,1,0,0,1 +156,0,174.5,65,197.4,116,238.5,86,10.6,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +125,0,191.6,115,205.6,108,210.2,123,9.2,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 
+128,0,245.2,112,101.5,101,152.3,116,10.7,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,1,0 +65,0,153.9,117,220.1,122,280.5,147,8.5,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +209,0,255.1,124,230.6,110,218.0,69,8.5,5,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +90,0,179.1,71,190.6,81,127.7,91,10.6,7,3,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +105,0,125.4,116,261.5,95,241.6,104,11.4,9,2,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +61,0,260.0,123,210.5,127,234.7,70,9.0,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +41,0,159.3,66,125.9,75,261.9,76,11.1,5,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +132,10,182.9,54,292.4,68,142.3,116,11.5,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +130,0,263.7,113,186.5,103,195.3,99,18.3,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +85,0,210.3,66,195.8,76,221.6,82,11.2,7,1,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +146,0,160.1,63,208.4,112,177.6,98,9.2,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,0,0,1,1,0 +95,0,203.4,96,168.6,61,173.0,105,13.7,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,1,0,1,0 +72,0,141.3,133,134.9,96,227.5,97,11.2,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +74,0,162.7,102,292.0,105,183.3,80,8.7,6,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +133,0,187.0,65,141.4,128,238.2,108,10.0,8,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,1,0,1,0 +90,0,200.9,92,164.3,91,249.0,98,8.9,7,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,1,0,1,0 +100,0,235.8,130,176.0,69,63.6,122,7.3,1,2,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +56,0,210.4,80,176.6,96,149.7,56,15.5,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,0,1,0,1,0 +183,31,171.2,104,193.6,74,196.5,85,10.2,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +138,0,194.3,83,189.9,97,232.2,102,9.0,3,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +86,0,190.5,115,179.6,130,258.5,89,10.1,5,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +86,39,261.2,122,214.2,101,154.9,101,12.7,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 
+83,0,78.1,70,239.3,115,144.4,112,12.3,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +72,33,96.6,59,315.4,98,163.3,117,6.2,4,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,1,0,0,1 +64,19,291.1,150,226.7,123,219.1,67,7.5,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +160,0,206.3,66,241.1,109,227.8,102,11.7,6,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +85,33,207.9,95,233.5,88,221.3,92,13.5,3,1,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +81,0,220.8,77,148.5,87,183.9,100,7.6,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +121,0,181.5,121,218.4,98,161.6,103,8.5,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +56,0,121.6,84,165.3,115,243.9,95,8.9,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +71,0,185.0,84,232.5,129,191.1,82,14.9,4,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +84,0,216.1,114,197.5,107,217.8,104,9.8,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +130,19,152.9,87,213.2,99,205.3,114,10.8,6,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +89,19,112.6,114,261.7,132,123.5,116,11.1,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1 +43,0,251.5,105,212.8,104,157.8,67,9.3,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1,1,0 +80,0,124.3,100,173.0,107,253.2,62,7.9,9,1,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +104,0,109.1,141,187.1,140,216.6,100,10.0,4,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +111,0,78.3,119,198.2,94,248.5,94,12.1,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +116,0,63.7,101,195.8,95,210.1,87,10.0,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +71,23,175.7,82,258.9,136,268.4,154,14.1,7,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +10,0,222.2,127,153.1,125,227.4,80,12.9,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,1,0,1,0 +106,0,169.4,107,197.2,71,202.2,79,10.7,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +85,0,222.3,132,231.5,101,223.5,75,11.0,2,3,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +149,0,237.6,79,192.4,107,207.4,111,9.1,9,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 
+182,0,279.5,118,203.2,113,174.2,101,10.7,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +120,0,149.2,98,193.6,88,248.9,119,11.1,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,0,1,0,1,0 +92,0,265.6,82,180.7,75,211.1,113,8.6,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +116,24,232.9,90,152.1,94,344.3,82,10.7,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,0,1,0,0,1 +133,0,201.7,85,169.4,116,286.3,80,6.0,4,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +81,0,145.4,132,129.3,91,186.4,109,5.2,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +97,0,217.6,81,320.5,51,150.7,110,4.2,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +171,0,137.5,110,198.1,109,292.7,131,13.3,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +104,0,130.5,77,131.2,117,264.7,63,13.0,6,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +50,0,295.3,127,127.4,100,166.8,105,9.6,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +99,0,254.4,120,159.3,92,264.4,94,6.0,5,1,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +16,0,209.5,89,172.8,85,94.1,102,8.8,4,1,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +11,24,131.5,98,230.2,111,283.7,87,10.0,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,0,0,1,0,0,1 +120,0,185.7,133,235.1,149,256.4,78,16.9,6,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +140,28,157.1,77,172.4,97,184.5,94,11.1,9,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +108,0,239.3,102,223.4,127,251.4,104,10.6,6,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +51,0,214.8,94,149.7,58,283.4,66,10.2,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +15,0,121.1,130,216.0,86,235.1,33,16.1,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +67,0,109.1,117,217.4,124,188.4,141,12.8,6,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +68,0,159.5,123,240.8,93,210.3,76,11.4,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +23,31,156.6,84,161.5,96,294.6,107,9.4,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,1,0,0,1 +92,0,176.3,85,93.4,125,207.2,107,9.6,1,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 
+138,0,46.5,104,186.0,114,167.5,95,9.6,4,4,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,1,0 +93,0,176.1,103,199.7,130,263.9,96,8.5,6,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +59,0,150.2,70,185.7,98,212.5,128,12.1,2,1,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +106,0,207.9,91,172.0,109,191.8,143,14.4,7,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +95,39,260.8,130,213.4,111,195.6,97,10.1,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0,0,1 +80,0,209.9,74,195.1,77,208.2,119,8.8,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +143,33,141.4,130,186.4,114,210.0,111,7.7,6,1,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +30,0,227.4,88,182.5,100,191.7,134,12.5,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +78,0,191.7,122,241.4,88,203.5,86,9.1,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +103,35,110.5,101,208.3,81,87.4,77,13.9,2,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +177,0,248.7,118,172.3,73,191.9,87,11.3,2,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +57,25,176.8,94,195.0,75,213.5,116,8.3,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +116,0,160.7,69,146.8,106,287.8,144,8.2,5,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +151,0,118.9,128,278.3,65,194.8,61,13.2,10,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +163,0,197.2,90,188.5,113,211.1,94,7.8,8,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +78,0,87.0,102,193.6,64,205.8,120,11.0,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1,0,1,0 +52,0,217.0,104,152.3,83,134.3,109,11.8,4,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +80,22,196.4,115,150.3,109,176.2,75,9.3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +175,0,132.0,95,231.2,74,313.4,108,8.7,10,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +108,0,112.0,105,193.7,110,208.9,93,4.1,4,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +127,0,96.0,117,177.0,68,162.2,127,9.7,4,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +52,0,209.8,114,171.3,82,154.6,119,9.9,9,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 
+130,0,127.0,102,206.9,107,231.7,99,6.1,6,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +147,0,155.1,117,239.7,93,208.8,133,10.6,4,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +110,0,74.5,117,200.8,98,192.2,101,9.8,7,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +110,27,267.9,103,263.3,74,178.1,106,8.3,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1 +94,0,139.4,95,159.1,92,128.2,129,7.7,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +99,28,200.7,88,264.2,116,172.7,102,9.1,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1,0,0,1 +180,33,231.8,78,232.9,79,206.9,121,7.6,4,1,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +36,0,177.9,129,224.6,87,306.3,102,10.8,6,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +86,41,119.0,101,230.0,134,236.9,58,9.5,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +130,0,242.5,101,102.8,114,142.4,89,9.3,2,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +111,13,193.1,104,111.6,98,227.4,94,12.1,4,1,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +92,0,197.2,113,242.3,116,192.0,76,11.0,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +82,0,185.8,36,276.5,134,192.1,104,5.7,7,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +56,30,127.1,89,172.1,116,194.6,111,12.1,3,1,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +97,0,256.4,125,273.9,100,222.7,101,11.1,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +120,0,178.4,97,168.3,113,120.5,93,9.3,9,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +91,0,153.0,123,141.1,127,171.5,76,10.3,15,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +90,0,175.9,111,285.2,115,150.8,122,13.0,7,1,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +93,0,328.1,106,151.7,89,303.5,114,8.7,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +85,0,235.8,109,157.2,94,188.2,99,12.0,3,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +48,0,171.9,98,159.0,127,139.5,101,7.6,3,2,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +93,42,152.3,90,267.5,102,266.9,130,11.3,5,7,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 
+76,33,189.7,66,212.8,65,165.7,108,10.0,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,1,0,0,1 +94,0,136.2,114,165.1,118,137.9,71,9.6,5,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +120,43,177.9,117,175.1,70,161.3,117,11.5,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +232,0,165.6,104,195.9,115,118.3,77,11.8,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +137,0,109.8,120,230.5,86,255.8,103,11.9,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +130,0,155.9,95,256.1,97,262.9,103,11.7,3,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +104,0,200.2,92,118.7,87,236.6,65,6.0,6,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +90,24,71.2,82,181.6,103,186.9,111,12.9,1,1,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +123,22,197.6,105,80.0,86,120.8,82,15.6,12,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +63,0,132.9,122,67.0,62,160.4,121,9.9,2,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,1,0,1,0 +59,0,159.5,96,167.2,123,138.6,106,10.2,4,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +131,0,153.4,86,198.5,81,164.4,83,10.4,3,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +12,0,249.6,118,252.4,119,280.2,90,11.8,3,1,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +126,0,321.3,99,167.9,93,193.6,106,8.0,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +114,0,203.8,85,87.8,110,166.2,122,11.7,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +108,0,198.5,99,267.8,60,354.9,75,9.4,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,0,0,1,0,1,0 +71,0,103.3,103,138.5,79,164.8,98,9.0,2,2,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +113,32,180.4,89,129.4,124,166.9,124,8.4,2,1,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +121,0,170.4,108,350.5,68,297.0,87,11.2,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +66,33,88.8,104,109.6,94,172.7,107,7.1,9,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +134,0,205.3,122,240.5,155,179.1,107,5.0,9,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +105,0,273.9,119,278.6,103,255.3,90,10.9,7,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 
+36,29,281.4,102,202.2,76,187.2,113,9.0,6,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +63,0,261.8,69,245.0,135,202.1,94,14.7,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,1,0,1,0 +86,31,167.6,139,113.0,118,246.9,121,12.2,6,1,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +126,0,161.4,110,220.6,125,249.2,78,5.1,2,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +36,0,178.6,83,213.1,103,198.0,119,10.9,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +136,16,90.4,105,201.3,109,227.1,115,13.1,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,0,1,0,0,1 +42,0,196.5,89,241.3,123,143.2,105,4.0,7,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +77,0,163.0,112,219.1,89,233.4,66,6.7,3,2,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +121,0,134.1,112,195.1,104,159.6,139,10.5,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,1,0,1,0 +171,17,186.9,94,240.0,138,200.9,64,5.8,3,1,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +21,0,225.0,110,244.2,111,221.2,93,10.7,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +113,0,132.1,72,247.5,107,246.2,123,6.9,6,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +71,0,258.4,132,126.8,119,182.4,87,9.7,8,4,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +161,0,297.9,141,238.1,107,240.5,93,8.9,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +189,0,219.9,80,143.3,117,130.6,69,11.7,7,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +113,0,204.3,82,188.8,115,139.4,97,9.2,7,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +182,24,128.1,104,143.4,127,191.0,98,11.6,3,1,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +125,32,96.5,109,145.8,109,174.4,82,9.4,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +45,0,96.1,103,246.8,134,229.7,92,9.7,4,1,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +125,39,236.1,107,289.2,110,175.4,107,9.1,4,2,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +102,0,114.8,125,81.9,126,304.3,101,12.0,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +94,0,174.0,85,241.1,114,207.8,94,7.9,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 
+74,35,154.1,104,123.4,84,202.1,57,10.9,9,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +116,44,230.6,94,224.1,103,244.0,76,11.1,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,0,1,0,0,1 +16,0,110.0,91,147.3,75,190.5,73,6.4,7,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +152,0,317.8,60,152.9,100,123.4,63,10.4,7,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +127,0,143.2,60,179.5,159,171.8,122,6.2,4,4,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +69,0,196.1,87,236.8,66,182.3,75,11.9,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +173,21,232.4,96,211.9,118,273.0,102,5.0,5,3,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +64,33,127.2,93,162.9,104,247.4,109,8.1,13,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1 +75,0,226.1,105,201.5,107,246.2,98,10.3,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +81,0,115.3,99,224.7,117,152.5,98,18.0,2,1,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +75,26,118.5,86,213.9,118,132.6,99,13.4,3,2,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +124,0,143.3,120,230.7,111,214.3,91,7.8,2,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +95,0,69.4,79,190.8,109,219.9,102,8.9,5,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +114,28,225.8,94,193.0,117,232.4,100,8.4,9,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +135,0,263.8,66,251.3,116,200.1,112,8.4,2,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +105,20,186.9,114,256.3,91,334.7,104,8.9,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +42,0,92.2,108,211.2,120,129.1,73,13.1,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +59,27,127.4,110,103.3,99,164.2,73,9.1,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +112,22,181.8,110,228.1,123,262.7,141,9.2,4,2,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +81,0,198.4,93,210.9,108,193.3,71,10.4,6,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +104,0,97.2,88,155.6,85,261.6,105,12.4,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +77,0,245.2,87,254.1,83,239.4,91,7.5,4,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 
+17,0,162.8,118,229.6,91,332.7,94,13.6,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +153,0,159.5,103,275.5,90,176.7,126,10.1,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,1,0,1,0 +22,0,160.4,108,218.1,88,192.9,115,12.5,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +135,0,173.4,107,222.0,84,64.2,94,13.7,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +149,0,140.4,94,271.8,92,188.3,108,11.1,9,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +155,0,71.2,90,304.4,119,183.3,103,8.6,4,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +25,0,242.6,69,209.0,117,219.7,82,14.4,6,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +174,33,167.8,91,205.3,91,130.0,132,14.5,4,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,0,1,0,0,1 +147,35,157.5,109,189.6,67,227.0,76,11.1,2,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,1 +74,0,177.4,136,240.3,104,237.3,133,12.0,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1,0,1,0 +101,0,206.6,105,224.9,117,249.9,100,14.6,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +127,0,134.9,79,221.5,114,113.8,118,15.0,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +86,16,144.8,105,206.2,111,255.4,117,11.6,2,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +102,31,125.3,92,141.2,108,168.2,68,6.3,2,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +99,0,174.1,102,99.1,118,211.6,126,7.7,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +105,23,193.5,85,220.2,90,272.4,111,8.5,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +12,0,204.6,98,212.5,90,182.1,95,9.8,7,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +130,0,162.8,113,290.3,111,114.9,140,7.2,3,1,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +150,27,209.8,112,155.0,80,251.5,111,7.2,6,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +70,0,170.2,98,155.2,102,228.6,76,15.0,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +99,42,216.0,125,232.3,104,215.5,100,9.3,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,1,0,0,1 +155,26,211.7,121,139.2,123,146.7,89,11.1,3,1,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1 
+152,0,228.1,93,136.4,106,197.3,107,9.0,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +110,0,100.1,90,233.3,93,204.4,57,11.1,8,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +98,0,165.0,129,202.6,113,172.3,94,12.5,7,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +91,44,216.6,101,173.1,98,242.1,95,9.1,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1,0,0,1 +74,25,234.4,113,265.9,82,241.4,77,13.7,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +113,0,207.2,113,256.0,80,211.0,87,9.9,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,1,0,1,0 +116,0,217.3,91,216.1,95,148.1,76,11.3,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +51,0,197.8,60,221.0,64,168.6,134,8.9,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1,0,1,0 +66,0,87.6,76,262.0,111,184.6,125,9.2,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +128,32,222.9,136,262.0,80,191.4,101,10.8,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,1,0,0,1 +152,0,141.5,102,263.0,94,207.1,113,3.4,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +73,0,94.9,121,253.2,83,175.1,86,14.2,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +143,22,141.8,116,167.3,99,178.1,130,7.8,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +73,26,131.2,98,106.5,97,221.7,96,10.2,6,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +108,35,215.9,106,200.6,107,195.4,107,15.5,7,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +122,0,119.3,93,223.9,103,211.9,122,8.7,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0,1,0 +47,28,112.2,70,154.8,106,166.7,105,10.6,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +157,0,229.8,90,147.9,121,241.4,108,9.6,7,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,0,1,0,1,0 +128,0,237.9,125,247.6,93,208.9,68,13.9,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +140,0,112.8,89,156.7,65,249.6,85,16.3,6,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +79,17,236.7,95,263.5,56,259.6,107,12.0,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +98,0,271.4,119,190.4,102,284.7,118,11.1,6,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 
+36,42,196.8,89,254.9,122,138.3,126,20.0,6,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1 +119,0,231.5,82,266.9,97,211.0,118,7.4,10,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +111,36,166.2,54,238.8,109,108.8,92,11.2,2,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,1,0,0,1 +134,0,141.7,95,205.6,101,218.5,60,8.8,6,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +118,0,224.6,94,225.9,120,269.0,105,12.5,8,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +148,0,202.0,102,243.2,128,261.3,90,10.9,3,1,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +62,0,189.5,122,103.8,95,180.6,106,10.8,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0,1,0 +139,0,181.6,119,335.7,118,149.8,64,8.3,6,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +86,16,145.7,88,191.0,129,215.5,82,11.3,7,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +112,0,81.6,94,268.1,112,140.8,75,8.6,18,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +120,0,221.3,106,267.6,98,111.5,80,9.3,7,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,0,1,0,1,0 +82,0,265.2,122,178.7,102,174.7,90,10.7,9,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,1,0,1,0 +46,0,124.8,133,157.3,143,199.3,72,8.6,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +105,0,270.9,98,226.2,110,178.8,60,8.8,5,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +95,0,174.0,57,281.1,118,197.2,94,9.7,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +88,0,131.5,99,174.8,128,184.2,83,7.9,2,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,1,0,1,0 +53,0,168.8,97,220.3,87,154.3,113,10.9,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +106,0,212.9,110,187.0,69,128.1,71,6.3,3,3,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +72,0,137.6,106,143.5,94,273.7,110,9.6,6,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +129,0,161.3,122,220.6,95,224.7,104,9.6,3,1,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +94,0,118.7,90,205.1,57,172.2,100,10.4,6,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,0,0,1,0,1,0 +46,0,257.4,67,261.1,91,204.4,107,13.4,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 
+115,26,170.5,107,217.2,77,225.7,71,13.6,5,6,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +36,0,117.1,94,235.4,117,221.3,108,9.0,2,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +157,0,180.4,123,194.0,98,227.3,88,8.4,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1,1,0 +121,0,168.9,128,123.9,99,266.3,105,2.9,7,2,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +112,0,208.7,150,212.8,104,178.1,98,8.5,4,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +64,0,168.0,116,192.4,94,166.5,98,10.1,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +121,41,215.5,95,241.8,92,147.0,108,9.6,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,0,1,0,0,1 +136,35,205.5,86,298.5,119,214.2,104,6.9,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,1,0,0,1 +111,36,96.8,123,170.6,105,166.0,85,13.4,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +149,28,126.9,97,166.9,102,145.2,77,8.8,3,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,0,1,0,1 +100,0,216.2,107,215.6,84,138.4,127,10.2,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +87,30,262.8,114,215.8,130,154.8,88,7.8,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +113,0,215.5,129,218.7,117,207.1,91,6.6,9,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +108,0,138.6,122,172.3,117,231.6,92,9.8,3,1,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +102,0,234.8,125,199.2,99,163.2,88,10.0,1,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1,0,1,0 +97,0,151.6,107,155.4,96,240.0,112,14.7,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +174,15,221.8,143,210.6,115,221.8,109,12.4,9,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +121,31,263.1,70,279.3,118,127.1,143,9.7,4,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +87,0,110.9,91,158.5,115,207.5,131,6.2,5,1,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +132,0,83.4,110,232.2,137,146.7,114,7.6,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +89,0,179.7,128,299.8,92,185.3,120,7.6,3,1,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +119,0,222.8,122,163.2,107,160.6,112,11.2,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 
+118,0,154.6,112,184.2,105,217.4,102,12.6,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,1,0,1,0 +70,0,226.7,98,228.1,115,73.2,93,17.6,4,2,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,1,0 +131,0,122.3,83,118.8,94,147.9,95,13.7,3,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,1,0 +78,0,225.1,67,199.2,127,175.5,102,14.6,2,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +45,0,78.2,127,253.4,108,255.0,100,18.0,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +74,0,106.4,84,140.2,104,90.9,81,11.4,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,0,1,0,1,0 +107,27,283.4,104,224.1,152,241.3,63,14.4,7,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,1,0,0,1 +79,0,222.3,99,146.2,82,275.6,82,8.9,4,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0,1,0 +80,31,166.4,92,238.3,74,150.7,84,10.7,4,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +25,0,264.9,80,281.2,66,166.1,80,8.4,4,1,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +39,0,60.4,158,306.2,120,123.9,46,12.4,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +135,0,155.2,100,135.9,84,184.6,82,3.8,9,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +119,0,260.1,101,256.5,68,229.1,89,10.0,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +59,0,166.3,95,239.3,87,123.2,108,10.0,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +127,14,143.2,99,169.9,91,221.6,77,11.6,1,1,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +86,0,166.2,112,255.3,81,228.1,97,5.4,7,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +36,43,29.9,123,129.1,117,325.9,105,8.6,6,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +123,0,163.1,119,249.4,51,168.2,77,9.0,10,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 diff --git a/sagemaker_model_monitor/introduction/test_data/test-dataset.csv b/sagemaker_model_monitor/introduction/test_data/test-dataset.csv new file mode 100644 index 0000000000..f5c57bebb8 --- /dev/null +++ b/sagemaker_model_monitor/introduction/test_data/test-dataset.csv @@ -0,0 +1,334 @@ +0,186,0,137.8,97,187.7,118,146.4,85,8.7,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,132,25,113.2,96,269.9,107,229.1,87,7.1,7,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 
+0,112,17,183.2,95,252.8,125,156.7,95,9.7,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,91,24,93.5,112,183.4,128,240.7,133,9.9,3,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1 +0,22,0,110.3,107,166.5,93,202.3,96,9.5,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,102,0,186.8,92,173.7,123,250.9,131,9.7,4,2,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,118,21,156.5,122,209.2,125,158.7,81,11.1,3,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,1 +0,178,35,175.4,88,190.0,65,138.7,94,10.5,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,107,0,234.1,91,163.1,105,282.5,100,10.0,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,1,0 +0,94,0,207.0,109,167.4,80,238.2,117,2.6,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1,0,1,0 +0,86,33,253.1,112,210.1,94,95.0,98,11.9,4,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,19,0,186.1,98,254.3,57,214.0,127,14.6,7,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,72,0,179.9,113,149.8,112,168.2,79,9.8,7,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1,0,1,0 +0,199,34,230.6,121,219.4,99,299.3,94,8.0,2,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,146,23,149.6,96,239.8,124,293.5,135,7.4,4,2,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,121,34,245.0,95,216.9,66,112.4,125,7.5,8,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,85,0,102.0,95,270.2,139,148.2,105,10.7,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,1,0,1,0 +0,29,0,195.6,71,126.4,74,148.6,87,14.2,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,123,34,305.2,80,156.5,109,280.0,81,13.2,7,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,57,0,161.0,113,208.0,134,208.1,81,8.4,4,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,92,16,184.0,99,76.4,134,185.1,96,12.7,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,1,0,0,1 +1,140,0,120.3,108,240.4,84,216.4,74,7.7,3,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,77,0,142.3,112,306.3,111,196.5,82,9.9,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 
+0,126,0,122.4,88,143.8,111,157.0,106,11.5,3,1,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,108,15,165.1,85,267.0,93,250.7,114,10.9,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,70,0,222.8,114,215.9,113,223.5,122,0.0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,113,0,193.8,99,221.4,125,172.3,67,10.6,6,1,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +1,111,0,294.7,90,294.6,72,260.1,121,10.8,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,140,0,160.5,114,240.5,103,233.5,121,11.3,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,109,0,217.0,115,207.0,142,268.0,106,8.2,4,1,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,74,0,157.1,95,213.1,36,280.4,77,7.6,3,2,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,96,0,200.6,117,289.5,120,98.3,95,11.2,5,2,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +0,86,0,194.2,98,193.8,95,192.0,123,9.3,7,3,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,130,0,211.2,119,231.1,120,220.9,80,6.3,9,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,0,0,1,0,1,0 +0,63,0,117.1,118,249.6,90,162.2,84,11.1,4,3,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +1,79,0,268.3,114,185.5,111,264.6,88,6.3,7,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,123,0,128.7,126,117.6,94,198.4,132,10.8,5,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,102,0,123.1,106,182.0,102,244.6,75,12.6,7,2,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,27,0,177.6,121,296.8,92,192.9,106,7.6,3,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,137,0,127.0,107,323.2,75,143.9,127,7.5,2,1,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,67,31,175.2,68,199.2,73,219.8,99,13.2,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,130,0,252.0,101,170.2,105,209.2,64,5.7,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,105,0,212.0,113,226.6,128,193.6,114,8.9,7,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,67,30,129.6,107,233.0,104,297.0,93,14.5,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0,0,1 
+0,195,0,63.2,108,220.2,88,184.0,99,5.1,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,106,0,133.7,45,187.8,107,181.9,89,10.7,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,1,0 +0,76,0,90.5,142,211.7,75,194.9,76,9.3,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,115,26,155.2,110,230.9,133,261.6,100,4.5,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,95,0,156.6,88,247.6,75,192.3,115,12.3,5,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,84,12,89.7,87,138.6,73,165.8,114,10.7,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,1,0,0,1 +0,156,0,174.5,65,197.4,116,238.5,86,10.6,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,125,0,191.6,115,205.6,108,210.2,123,9.2,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,128,0,245.2,112,101.5,101,152.3,116,10.7,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,1,0 +0,65,0,153.9,117,220.1,122,280.5,147,8.5,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +1,209,0,255.1,124,230.6,110,218.0,69,8.5,5,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,90,0,179.1,71,190.6,81,127.7,91,10.6,7,3,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,105,0,125.4,116,261.5,95,241.6,104,11.4,9,2,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +1,61,0,260.0,123,210.5,127,234.7,70,9.0,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,41,0,159.3,66,125.9,75,261.9,76,11.1,5,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,132,10,182.9,54,292.4,68,142.3,116,11.5,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +1,130,0,263.7,113,186.5,103,195.3,99,18.3,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,85,0,210.3,66,195.8,76,221.6,82,11.2,7,1,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,146,0,160.1,63,208.4,112,177.6,98,9.2,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,0,0,1,1,0 +0,95,0,203.4,96,168.6,61,173.0,105,13.7,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,1,0,1,0 +0,72,0,141.3,133,134.9,96,227.5,97,11.2,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,74,0,162.7,102,292.0,105,183.3,80,8.7,6,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 
+0,133,0,187.0,65,141.4,128,238.2,108,10.0,8,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,1,0,1,0 +0,90,0,200.9,92,164.3,91,249.0,98,8.9,7,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,1,0,1,0 +0,100,0,235.8,130,176.0,69,63.6,122,7.3,1,2,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,56,0,210.4,80,176.6,96,149.7,56,15.5,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,0,1,0,1,0 +0,183,31,171.2,104,193.6,74,196.5,85,10.2,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,138,0,194.3,83,189.9,97,232.2,102,9.0,3,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,86,0,190.5,115,179.6,130,258.5,89,10.1,5,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,86,39,261.2,122,214.2,101,154.9,101,12.7,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,83,0,78.1,70,239.3,115,144.4,112,12.3,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +1,72,33,96.6,59,315.4,98,163.3,117,6.2,4,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,1,0,0,1 +0,64,19,291.1,150,226.7,123,219.1,67,7.5,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,160,0,206.3,66,241.1,109,227.8,102,11.7,6,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,85,33,207.9,95,233.5,88,221.3,92,13.5,3,1,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,81,0,220.8,77,148.5,87,183.9,100,7.6,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,121,0,181.5,121,218.4,98,161.6,103,8.5,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,56,0,121.6,84,165.3,115,243.9,95,8.9,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,71,0,185.0,84,232.5,129,191.1,82,14.9,4,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,84,0,216.1,114,197.5,107,217.8,104,9.8,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,130,19,152.9,87,213.2,99,205.3,114,10.8,6,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +1,89,19,112.6,114,261.7,132,123.5,116,11.1,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1 +0,43,0,251.5,105,212.8,104,157.8,67,9.3,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1,1,0 +0,80,0,124.3,100,173.0,107,253.2,62,7.9,9,1,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 
+0,104,0,109.1,141,187.1,140,216.6,100,10.0,4,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,111,0,78.3,119,198.2,94,248.5,94,12.1,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,116,0,63.7,101,195.8,95,210.1,87,10.0,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,71,23,175.7,82,258.9,136,268.4,154,14.1,7,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,10,0,222.2,127,153.1,125,227.4,80,12.9,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,1,0,1,0 +0,106,0,169.4,107,197.2,71,202.2,79,10.7,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,85,0,222.3,132,231.5,101,223.5,75,11.0,2,3,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,149,0,237.6,79,192.4,107,207.4,111,9.1,9,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +1,182,0,279.5,118,203.2,113,174.2,101,10.7,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,120,0,149.2,98,193.6,88,248.9,119,11.1,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,0,1,0,1,0 +0,92,0,265.6,82,180.7,75,211.1,113,8.6,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,116,24,232.9,90,152.1,94,344.3,82,10.7,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,133,0,201.7,85,169.4,116,286.3,80,6.0,4,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,81,0,145.4,132,129.3,91,186.4,109,5.2,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,97,0,217.6,81,320.5,51,150.7,110,4.2,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,171,0,137.5,110,198.1,109,292.7,131,13.3,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,104,0,130.5,77,131.2,117,264.7,63,13.0,6,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,50,0,295.3,127,127.4,100,166.8,105,9.6,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,99,0,254.4,120,159.3,92,264.4,94,6.0,5,1,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,16,0,209.5,89,172.8,85,94.1,102,8.8,4,1,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,11,24,131.5,98,230.2,111,283.7,87,10.0,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,0,0,1,0,0,1 
+0,120,0,185.7,133,235.1,149,256.4,78,16.9,6,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,140,28,157.1,77,172.4,97,184.5,94,11.1,9,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,108,0,239.3,102,223.4,127,251.4,104,10.6,6,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,51,0,214.8,94,149.7,58,283.4,66,10.2,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,15,0,121.1,130,216.0,86,235.1,33,16.1,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,67,0,109.1,117,217.4,124,188.4,141,12.8,6,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,68,0,159.5,123,240.8,93,210.3,76,11.4,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,23,31,156.6,84,161.5,96,294.6,107,9.4,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,1,0,0,1 +0,92,0,176.3,85,93.4,125,207.2,107,9.6,1,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +1,138,0,46.5,104,186.0,114,167.5,95,9.6,4,4,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,1,0 +0,93,0,176.1,103,199.7,130,263.9,96,8.5,6,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,59,0,150.2,70,185.7,98,212.5,128,12.1,2,1,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,106,0,207.9,91,172.0,109,191.8,143,14.4,7,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,95,39,260.8,130,213.4,111,195.6,97,10.1,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0,0,1 +0,80,0,209.9,74,195.1,77,208.2,119,8.8,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,143,33,141.4,130,186.4,114,210.0,111,7.7,6,1,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,30,0,227.4,88,182.5,100,191.7,134,12.5,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,78,0,191.7,122,241.4,88,203.5,86,9.1,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,103,35,110.5,101,208.3,81,87.4,77,13.9,2,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +1,177,0,248.7,118,172.3,73,191.9,87,11.3,2,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +0,57,25,176.8,94,195.0,75,213.5,116,8.3,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 
+0,116,0,160.7,69,146.8,106,287.8,144,8.2,5,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,151,0,118.9,128,278.3,65,194.8,61,13.2,10,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,163,0,197.2,90,188.5,113,211.1,94,7.8,8,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +0,78,0,87.0,102,193.6,64,205.8,120,11.0,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1,0,1,0 +0,52,0,217.0,104,152.3,83,134.3,109,11.8,4,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,80,22,196.4,115,150.3,109,176.2,75,9.3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,175,0,132.0,95,231.2,74,313.4,108,8.7,10,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,108,0,112.0,105,193.7,110,208.9,93,4.1,4,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,127,0,96.0,117,177.0,68,162.2,127,9.7,4,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,52,0,209.8,114,171.3,82,154.6,119,9.9,9,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,130,0,127.0,102,206.9,107,231.7,99,6.1,6,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,147,0,155.1,117,239.7,93,208.8,133,10.6,4,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,110,0,74.5,117,200.8,98,192.2,101,9.8,7,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +1,110,27,267.9,103,263.3,74,178.1,106,8.3,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1 +0,94,0,139.4,95,159.1,92,128.2,129,7.7,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,99,28,200.7,88,264.2,116,172.7,102,9.1,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1,0,0,1 +0,180,33,231.8,78,232.9,79,206.9,121,7.6,4,1,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,36,0,177.9,129,224.6,87,306.3,102,10.8,6,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,86,41,119.0,101,230.0,134,236.9,58,9.5,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,130,0,242.5,101,102.8,114,142.4,89,9.3,2,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,111,13,193.1,104,111.6,98,227.4,94,12.1,4,1,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 
+0,92,0,197.2,113,242.3,116,192.0,76,11.0,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,82,0,185.8,36,276.5,134,192.1,104,5.7,7,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,56,30,127.1,89,172.1,116,194.6,111,12.1,3,1,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +1,97,0,256.4,125,273.9,100,222.7,101,11.1,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,120,0,178.4,97,168.3,113,120.5,93,9.3,9,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,91,0,153.0,123,141.1,127,171.5,76,10.3,15,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,90,0,175.9,111,285.2,115,150.8,122,13.0,7,1,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,93,0,328.1,106,151.7,89,303.5,114,8.7,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,85,0,235.8,109,157.2,94,188.2,99,12.0,3,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,48,0,171.9,98,159.0,127,139.5,101,7.6,3,2,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,93,42,152.3,90,267.5,102,266.9,130,11.3,5,7,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,76,33,189.7,66,212.8,65,165.7,108,10.0,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,1,0,0,1 +0,94,0,136.2,114,165.1,118,137.9,71,9.6,5,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,120,43,177.9,117,175.1,70,161.3,117,11.5,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,232,0,165.6,104,195.9,115,118.3,77,11.8,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,137,0,109.8,120,230.5,86,255.8,103,11.9,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,130,0,155.9,95,256.1,97,262.9,103,11.7,3,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,104,0,200.2,92,118.7,87,236.6,65,6.0,6,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,90,24,71.2,82,181.6,103,186.9,111,12.9,1,1,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,123,22,197.6,105,80.0,86,120.8,82,15.6,12,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,63,0,132.9,122,67.0,62,160.4,121,9.9,2,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,1,0,1,0 
+0,59,0,159.5,96,167.2,123,138.6,106,10.2,4,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,131,0,153.4,86,198.5,81,164.4,83,10.4,3,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +1,12,0,249.6,118,252.4,119,280.2,90,11.8,3,1,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +1,126,0,321.3,99,167.9,93,193.6,106,8.0,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,114,0,203.8,85,87.8,110,166.2,122,11.7,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +1,108,0,198.5,99,267.8,60,354.9,75,9.4,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,0,0,1,0,1,0 +0,71,0,103.3,103,138.5,79,164.8,98,9.0,2,2,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,113,32,180.4,89,129.4,124,166.9,124,8.4,2,1,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +1,121,0,170.4,108,350.5,68,297.0,87,11.2,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,66,33,88.8,104,109.6,94,172.7,107,7.1,9,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,134,0,205.3,122,240.5,155,179.1,107,5.0,9,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +1,105,0,273.9,119,278.6,103,255.3,90,10.9,7,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +0,36,29,281.4,102,202.2,76,187.2,113,9.0,6,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +1,63,0,261.8,69,245.0,135,202.1,94,14.7,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,1,0,1,0 +0,86,31,167.6,139,113.0,118,246.9,121,12.2,6,1,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,126,0,161.4,110,220.6,125,249.2,78,5.1,2,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,36,0,178.6,83,213.1,103,198.0,119,10.9,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,136,16,90.4,105,201.3,109,227.1,115,13.1,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,0,1,0,0,1 +0,42,0,196.5,89,241.3,123,143.2,105,4.0,7,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,77,0,163.0,112,219.1,89,233.4,66,6.7,3,2,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,121,0,134.1,112,195.1,104,159.6,139,10.5,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,1,0,1,0 
+0,171,17,186.9,94,240.0,138,200.9,64,5.8,3,1,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,21,0,225.0,110,244.2,111,221.2,93,10.7,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,113,0,132.1,72,247.5,107,246.2,123,6.9,6,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,71,0,258.4,132,126.8,119,182.4,87,9.7,8,4,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,161,0,297.9,141,238.1,107,240.5,93,8.9,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,189,0,219.9,80,143.3,117,130.6,69,11.7,7,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,113,0,204.3,82,188.8,115,139.4,97,9.2,7,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +0,182,24,128.1,104,143.4,127,191.0,98,11.6,3,1,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,125,32,96.5,109,145.8,109,174.4,82,9.4,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,45,0,96.1,103,246.8,134,229.7,92,9.7,4,1,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,125,39,236.1,107,289.2,110,175.4,107,9.1,4,2,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,102,0,114.8,125,81.9,126,304.3,101,12.0,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,94,0,174.0,85,241.1,114,207.8,94,7.9,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,74,35,154.1,104,123.4,84,202.1,57,10.9,9,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,116,44,230.6,94,224.1,103,244.0,76,11.1,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,16,0,110.0,91,147.3,75,190.5,73,6.4,7,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +1,152,0,317.8,60,152.9,100,123.4,63,10.4,7,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,127,0,143.2,60,179.5,159,171.8,122,6.2,4,4,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +0,69,0,196.1,87,236.8,66,182.3,75,11.9,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,173,21,232.4,96,211.9,118,273.0,102,5.0,5,3,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,64,33,127.2,93,162.9,104,247.4,109,8.1,13,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1 
+0,75,0,226.1,105,201.5,107,246.2,98,10.3,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,81,0,115.3,99,224.7,117,152.5,98,18.0,2,1,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,75,26,118.5,86,213.9,118,132.6,99,13.4,3,2,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +1,124,0,143.3,120,230.7,111,214.3,91,7.8,2,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,95,0,69.4,79,190.8,109,219.9,102,8.9,5,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,114,28,225.8,94,193.0,117,232.4,100,8.4,9,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +1,135,0,263.8,66,251.3,116,200.1,112,8.4,2,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,105,20,186.9,114,256.3,91,334.7,104,8.9,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,42,0,92.2,108,211.2,120,129.1,73,13.1,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,59,27,127.4,110,103.3,99,164.2,73,9.1,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,112,22,181.8,110,228.1,123,262.7,141,9.2,4,2,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,81,0,198.4,93,210.9,108,193.3,71,10.4,6,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,104,0,97.2,88,155.6,85,261.6,105,12.4,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +1,77,0,245.2,87,254.1,83,239.4,91,7.5,4,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,17,0,162.8,118,229.6,91,332.7,94,13.6,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +1,153,0,159.5,103,275.5,90,176.7,126,10.1,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,1,0,1,0 +0,22,0,160.4,108,218.1,88,192.9,115,12.5,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,135,0,173.4,107,222.0,84,64.2,94,13.7,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,149,0,140.4,94,271.8,92,188.3,108,11.1,9,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,155,0,71.2,90,304.4,119,183.3,103,8.6,4,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,25,0,242.6,69,209.0,117,219.7,82,14.4,6,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 
+1,174,33,167.8,91,205.3,91,130.0,132,14.5,4,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,0,1,0,0,1 +1,147,35,157.5,109,189.6,67,227.0,76,11.1,2,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,1 +0,74,0,177.4,136,240.3,104,237.3,133,12.0,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1,0,1,0 +0,101,0,206.6,105,224.9,117,249.9,100,14.6,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,127,0,134.9,79,221.5,114,113.8,118,15.0,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,86,16,144.8,105,206.2,111,255.4,117,11.6,2,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,102,31,125.3,92,141.2,108,168.2,68,6.3,2,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,99,0,174.1,102,99.1,118,211.6,126,7.7,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,105,23,193.5,85,220.2,90,272.4,111,8.5,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,12,0,204.6,98,212.5,90,182.1,95,9.8,7,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,130,0,162.8,113,290.3,111,114.9,140,7.2,3,1,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,150,27,209.8,112,155.0,80,251.5,111,7.2,6,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,70,0,170.2,98,155.2,102,228.6,76,15.0,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +1,99,42,216.0,125,232.3,104,215.5,100,9.3,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,1,0,0,1 +0,155,26,211.7,121,139.2,123,146.7,89,11.1,3,1,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1 +0,152,0,228.1,93,136.4,106,197.3,107,9.0,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,110,0,100.1,90,233.3,93,204.4,57,11.1,8,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +1,98,0,165.0,129,202.6,113,172.3,94,12.5,7,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,91,44,216.6,101,173.1,98,242.1,95,9.1,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1,0,0,1 +0,74,25,234.4,113,265.9,82,241.4,77,13.7,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,113,0,207.2,113,256.0,80,211.0,87,9.9,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,1,0,1,0 
+0,116,0,217.3,91,216.1,95,148.1,76,11.3,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,51,0,197.8,60,221.0,64,168.6,134,8.9,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1,0,1,0 +0,66,0,87.6,76,262.0,111,184.6,125,9.2,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,128,32,222.9,136,262.0,80,191.4,101,10.8,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,1,0,0,1 +0,152,0,141.5,102,263.0,94,207.1,113,3.4,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,73,0,94.9,121,253.2,83,175.1,86,14.2,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,143,22,141.8,116,167.3,99,178.1,130,7.8,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,73,26,131.2,98,106.5,97,221.7,96,10.2,6,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,108,35,215.9,106,200.6,107,195.4,107,15.5,7,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,122,0,119.3,93,223.9,103,211.9,122,8.7,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0,1,0 +0,47,28,112.2,70,154.8,106,166.7,105,10.6,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,157,0,229.8,90,147.9,121,241.4,108,9.6,7,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,0,1,0,1,0 +1,128,0,237.9,125,247.6,93,208.9,68,13.9,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,140,0,112.8,89,156.7,65,249.6,85,16.3,6,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,79,17,236.7,95,263.5,56,259.6,107,12.0,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +1,98,0,271.4,119,190.4,102,284.7,118,11.1,6,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +1,36,42,196.8,89,254.9,122,138.3,126,20.0,6,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1 +0,119,0,231.5,82,266.9,97,211.0,118,7.4,10,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,111,36,166.2,54,238.8,109,108.8,92,11.2,2,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,1,0,0,1 +0,134,0,141.7,95,205.6,101,218.5,60,8.8,6,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,118,0,224.6,94,225.9,120,269.0,105,12.5,8,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 
+0,148,0,202.0,102,243.2,128,261.3,90,10.9,3,1,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,62,0,189.5,122,103.8,95,180.6,106,10.8,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0,1,0 +0,139,0,181.6,119,335.7,118,149.8,64,8.3,6,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,86,16,145.7,88,191.0,129,215.5,82,11.3,7,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,112,0,81.6,94,268.1,112,140.8,75,8.6,18,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,120,0,221.3,106,267.6,98,111.5,80,9.3,7,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,0,1,0,1,0 +0,82,0,265.2,122,178.7,102,174.7,90,10.7,9,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,1,0,1,0 +0,46,0,124.8,133,157.3,143,199.3,72,8.6,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,105,0,270.9,98,226.2,110,178.8,60,8.8,5,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,95,0,174.0,57,281.1,118,197.2,94,9.7,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +1,88,0,131.5,99,174.8,128,184.2,83,7.9,2,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,1,0,1,0 +0,53,0,168.8,97,220.3,87,154.3,113,10.9,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,106,0,212.9,110,187.0,69,128.1,71,6.3,3,3,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,72,0,137.6,106,143.5,94,273.7,110,9.6,6,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,129,0,161.3,122,220.6,95,224.7,104,9.6,3,1,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,94,0,118.7,90,205.1,57,172.2,100,10.4,6,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,0,0,1,0,1,0 +1,46,0,257.4,67,261.1,91,204.4,107,13.4,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,115,26,170.5,107,217.2,77,225.7,71,13.6,5,6,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,36,0,117.1,94,235.4,117,221.3,108,9.0,2,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,157,0,180.4,123,194.0,98,227.3,88,8.4,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1,1,0 +0,121,0,168.9,128,123.9,99,266.3,105,2.9,7,2,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,112,0,208.7,150,212.8,104,178.1,98,8.5,4,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 
+0,64,0,168.0,116,192.4,94,166.5,98,10.1,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,121,41,215.5,95,241.8,92,147.0,108,9.6,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,136,35,205.5,86,298.5,119,214.2,104,6.9,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,1,0,0,1 +0,111,36,96.8,123,170.6,105,166.0,85,13.4,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +1,149,28,126.9,97,166.9,102,145.2,77,8.8,3,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,0,1,0,1 +0,100,0,216.2,107,215.6,84,138.4,127,10.2,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,87,30,262.8,114,215.8,130,154.8,88,7.8,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,113,0,215.5,129,218.7,117,207.1,91,6.6,9,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,108,0,138.6,122,172.3,117,231.6,92,9.8,3,1,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,102,0,234.8,125,199.2,99,163.2,88,10.0,1,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1,0,1,0 +0,97,0,151.6,107,155.4,96,240.0,112,14.7,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,174,15,221.8,143,210.6,115,221.8,109,12.4,9,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,121,31,263.1,70,279.3,118,127.1,143,9.7,4,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,87,0,110.9,91,158.5,115,207.5,131,6.2,5,1,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,132,0,83.4,110,232.2,137,146.7,114,7.6,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,89,0,179.7,128,299.8,92,185.3,120,7.6,3,1,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,119,0,222.8,122,163.2,107,160.6,112,11.2,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,118,0,154.6,112,184.2,105,217.4,102,12.6,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,1,0,1,0 +1,70,0,226.7,98,228.1,115,73.2,93,17.6,4,2,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,1,0 +1,131,0,122.3,83,118.8,94,147.9,95,13.7,3,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,1,0 +0,78,0,225.1,67,199.2,127,175.5,102,14.6,2,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 
+0,45,0,78.2,127,253.4,108,255.0,100,18.0,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,74,0,106.4,84,140.2,104,90.9,81,11.4,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,107,27,283.4,104,224.1,152,241.3,63,14.4,7,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,1,0,0,1 +0,79,0,222.3,99,146.2,82,275.6,82,8.9,4,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0,1,0 +0,80,31,166.4,92,238.3,74,150.7,84,10.7,4,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +1,25,0,264.9,80,281.2,66,166.1,80,8.4,4,1,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,39,0,60.4,158,306.2,120,123.9,46,12.4,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,135,0,155.2,100,135.9,84,184.6,82,3.8,9,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +1,119,0,260.1,101,256.5,68,229.1,89,10.0,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,59,0,166.3,95,239.3,87,123.2,108,10.0,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,127,14,143.2,99,169.9,91,221.6,77,11.6,1,1,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,86,0,166.2,112,255.3,81,228.1,97,5.4,7,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,36,43,29.9,123,129.1,117,325.9,105,8.6,6,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,123,0,163.1,119,249.4,51,168.2,77,9.0,10,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 diff --git a/sagemaker_model_monitor/introduction/test_data/test_sample.csv b/sagemaker_model_monitor/introduction/test_data/test_sample.csv new file mode 100644 index 0000000000..686f9f958f --- /dev/null +++ b/sagemaker_model_monitor/introduction/test_data/test_sample.csv @@ -0,0 +1,120 @@ +186,0.1,137.8,97,187.7,118,146.4,85,8.7,6,1,0.2,0.3,0.4,0.5,0.6,0.7,0.8,0.9,0.10,0.11,0.12,0.13,0.14,0.15,0.16,0.17,1.1,0.18,0.19,0.20,0.21,0.22,0.23,0.24,0.25,0.26,0.27,0.28,0.29,0.30,0.31,0.32,0.33,0.34,0.35,0.36,0.37,0.38,0.39,0.40,0.41,0.42,0.43,0.44,0.45,0.46,0.47,0.48,0.49,0.50,0.51,0.52,0.53,1.2,1.3,0.54,1.4,0.55 +132,25,113.2,96,269.9,107,229.1,87,7.1,7,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +112,17,183.2,95,252.8,125,156.7,95,9.7,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +91,24,93.5,112,183.4,128,240.7,133,9.9,3,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1 +22,0,110.3,107,166.5,93,202.3,96,9.5,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 
+102,0,186.8,92,173.7,123,250.9,131,9.7,4,2,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +118,21,156.5,122,209.2,125,158.7,81,11.1,3,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,1 +178,35,175.4,88,190.0,65,138.7,94,10.5,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +107,0,234.1,91,163.1,105,282.5,100,10.0,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,1,0 +94,0,207.0,109,167.4,80,238.2,117,2.6,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1,0,1,0 +86,33,253.1,112,210.1,94,95.0,98,11.9,4,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +19,0,186.1,98,254.3,57,214.0,127,14.6,7,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +72,0,179.9,113,149.8,112,168.2,79,9.8,7,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1,0,1,0 +199,34,230.6,121,219.4,99,299.3,94,8.0,2,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +146,23,149.6,96,239.8,124,293.5,135,7.4,4,2,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +121,34,245.0,95,216.9,66,112.4,125,7.5,8,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +85,0,102.0,95,270.2,139,148.2,105,10.7,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,1,0,1,0 +29,0,195.6,71,126.4,74,148.6,87,14.2,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +123,34,305.2,80,156.5,109,280.0,81,13.2,7,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +57,0,161.0,113,208.0,134,208.1,81,8.4,4,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +92,16,184.0,99,76.4,134,185.1,96,12.7,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,1,0,0,1 +140,0,120.3,108,240.4,84,216.4,74,7.7,3,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +77,0,142.3,112,306.3,111,196.5,82,9.9,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +126,0,122.4,88,143.8,111,157.0,106,11.5,3,1,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +108,15,165.1,85,267.0,93,250.7,114,10.9,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +70,0,222.8,114,215.9,113,223.5,122,0.0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +113,0,193.8,99,221.4,125,172.3,67,10.6,6,1,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 
+111,0,294.7,90,294.6,72,260.1,121,10.8,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +140,0,160.5,114,240.5,103,233.5,121,11.3,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +109,0,217.0,115,207.0,142,268.0,106,8.2,4,1,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +74,0,157.1,95,213.1,36,280.4,77,7.6,3,2,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +96,0,200.6,117,289.5,120,98.3,95,11.2,5,2,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +86,0,194.2,98,193.8,95,192.0,123,9.3,7,3,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +130,0,211.2,119,231.1,120,220.9,80,6.3,9,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,0,0,1,0,1,0 +63,0,117.1,118,249.6,90,162.2,84,11.1,4,3,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +79,0,268.3,114,185.5,111,264.6,88,6.3,7,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +123,0,128.7,126,117.6,94,198.4,132,10.8,5,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +102,0,123.1,106,182.0,102,244.6,75,12.6,7,2,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +27,0,177.6,121,296.8,92,192.9,106,7.6,3,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +137,0,127.0,107,323.2,75,143.9,127,7.5,2,1,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +67,31,175.2,68,199.2,73,219.8,99,13.2,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +130,0,252.0,101,170.2,105,209.2,64,5.7,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +105,0,212.0,113,226.6,128,193.6,114,8.9,7,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +67,30,129.6,107,233.0,104,297.0,93,14.5,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0,0,1 +195,0,63.2,108,220.2,88,184.0,99,5.1,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,0,1,0,1,0 +106,0,133.7,45,187.8,107,181.9,89,10.7,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,1,0 +76,0,90.5,142,211.7,75,194.9,76,9.3,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +115,26,155.2,110,230.9,133,261.6,100,4.5,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +95,0,156.6,88,247.6,75,192.3,115,12.3,5,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 
+84,12,89.7,87,138.6,73,165.8,114,10.7,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,1,0,0,1 +156,0,174.5,65,197.4,116,238.5,86,10.6,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +125,0,191.6,115,205.6,108,210.2,123,9.2,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +128,0,245.2,112,101.5,101,152.3,116,10.7,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,1,0 +65,0,153.9,117,220.1,122,280.5,147,8.5,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +209,0,255.1,124,230.6,110,218.0,69,8.5,5,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +90,0,179.1,71,190.6,81,127.7,91,10.6,7,3,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +105,0,125.4,116,261.5,95,241.6,104,11.4,9,2,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +61,0,260.0,123,210.5,127,234.7,70,9.0,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +41,0,159.3,66,125.9,75,261.9,76,11.1,5,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +132,10,182.9,54,292.4,68,142.3,116,11.5,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +130,0,263.7,113,186.5,103,195.3,99,18.3,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +85,0,210.3,66,195.8,76,221.6,82,11.2,7,1,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +146,0,160.1,63,208.4,112,177.6,98,9.2,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,0,0,1,1,0 +95,0,203.4,96,168.6,61,173.0,105,13.7,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,1,0,1,0 +72,0,141.3,133,134.9,96,227.5,97,11.2,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +74,0,162.7,102,292.0,105,183.3,80,8.7,6,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +133,0,187.0,65,141.4,128,238.2,108,10.0,8,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,1,0,1,0 +90,0,200.9,92,164.3,91,249.0,98,8.9,7,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,1,0,1,0 +100,0,235.8,130,176.0,69,63.6,122,7.3,1,2,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +56,0,210.4,80,176.6,96,149.7,56,15.5,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,0,1,0,1,0 +183,31,171.2,104,193.6,74,196.5,85,10.2,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 
+138,0,194.3,83,189.9,97,232.2,102,9.0,3,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +86,0,190.5,115,179.6,130,258.5,89,10.1,5,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +86,39,261.2,122,214.2,101,154.9,101,12.7,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +83,0,78.1,70,239.3,115,144.4,112,12.3,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +72,33,96.6,59,315.4,98,163.3,117,6.2,4,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,1,0,0,1 +64,19,291.1,150,226.7,123,219.1,67,7.5,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +160,0,206.3,66,241.1,109,227.8,102,11.7,6,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +85,33,207.9,95,233.5,88,221.3,92,13.5,3,1,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +81,0,220.8,77,148.5,87,183.9,100,7.6,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +121,0,181.5,121,218.4,98,161.6,103,8.5,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +56,0,121.6,84,165.3,115,243.9,95,8.9,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +71,0,185.0,84,232.5,129,191.1,82,14.9,4,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +84,0,216.1,114,197.5,107,217.8,104,9.8,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +130,19,152.9,87,213.2,99,205.3,114,10.8,6,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +89,19,112.6,114,261.7,132,123.5,116,11.1,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1 +43,0,251.5,105,212.8,104,157.8,67,9.3,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1,1,0 +80,0,124.3,100,173.0,107,253.2,62,7.9,9,1,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +104,0,109.1,141,187.1,140,216.6,100,10.0,4,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +111,0,78.3,119,198.2,94,248.5,94,12.1,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +116,0,63.7,101,195.8,95,210.1,87,10.0,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +71,23,175.7,82,258.9,136,268.4,154,14.1,7,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +10,0,222.2,127,153.1,125,227.4,80,12.9,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,1,0,1,0 
+106,0,169.4,107,197.2,71,202.2,79,10.7,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +85,0,222.3,132,231.5,101,223.5,75,11.0,2,3,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +149,0,237.6,79,192.4,107,207.4,111,9.1,9,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +182,0,279.5,118,203.2,113,174.2,101,10.7,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +120,0,149.2,98,193.6,88,248.9,119,11.1,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,0,1,0,1,0 +92,0,265.6,82,180.7,75,211.1,113,8.6,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +116,24,232.9,90,152.1,94,344.3,82,10.7,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,0,1,0,0,1 +133,0,201.7,85,169.4,116,286.3,80,6.0,4,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +81,0,145.4,132,129.3,91,186.4,109,5.2,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +97,0,217.6,81,320.5,51,150.7,110,4.2,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +171,0,137.5,110,198.1,109,292.7,131,13.3,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +104,0,130.5,77,131.2,117,264.7,63,13.0,6,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +50,0,295.3,127,127.4,100,166.8,105,9.6,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +99,0,254.4,120,159.3,92,264.4,94,6.0,5,1,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +16,0,209.5,89,172.8,85,94.1,102,8.8,4,1,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +11,24,131.5,98,230.2,111,283.7,87,10.0,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,0,0,1,0,0,1 +120,0,185.7,133,235.1,149,256.4,78,16.9,6,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +140,28,157.1,77,172.4,97,184.5,94,11.1,9,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +108,0,239.3,102,223.4,127,251.4,104,10.6,6,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +51,0,214.8,94,149.7,58,283.4,66,10.2,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +15,0,121.1,130,216.0,86,235.1,33,16.1,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +67,0,109.1,117,217.4,124,188.4,141,12.8,6,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 
+68,0,159.5,123,240.8,93,210.3,76,11.4,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +23,31,156.6,84,161.5,96,294.6,107,9.4,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,1,0,0,1 +92,0,176.3,85,93.4,125,207.2,107,9.6,1,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +138,0,46.5,104,186.0,114,167.5,95,9.6,4,4,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,1,0 +93,0,176.1,103,199.7,130,263.9,96,8.5,6,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 diff --git a/sagemaker_model_monitor/introduction/test_data/training-dataset-with-header.csv b/sagemaker_model_monitor/introduction/test_data/training-dataset-with-header.csv new file mode 100644 index 0000000000..7d76778bdf --- /dev/null +++ b/sagemaker_model_monitor/introduction/test_data/training-dataset-with-header.csv @@ -0,0 +1,2334 @@ +Churn,Account Length,VMail Message,Day Mins,Day Calls,Eve Mins,Eve Calls,Night Mins,Night Calls,Intl Mins,Intl Calls,CustServ Calls,State_AK,State_AL,State_AR,State_AZ,State_CA,State_CO,State_CT,State_DC,State_DE,State_FL,State_GA,State_HI,State_IA,State_ID,State_IL,State_IN,State_KS,State_KY,State_LA,State_MA,State_MD,State_ME,State_MI,State_MN,State_MO,State_MS,State_MT,State_NC,State_ND,State_NE,State_NH,State_NJ,State_NM,State_NV,State_NY,State_OH,State_OK,State_OR,State_PA,State_RI,State_SC,State_SD,State_TN,State_TX,State_UT,State_VA,State_VT,State_WA,State_WI,State_WV,State_WY,Area Code_408,Area Code_415,Area Code_510,Int'l Plan_no,Int'l Plan_yes,VMail Plan_no,VMail Plan_yes +0,106,0,274.4,120,198.6,82,160.8,62,6.0,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,28,0,187.8,94,248.6,86,208.8,124,10.6,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0,1,0 +1,148,0,279.3,104,201.6,87,280.8,99,7.9,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,132,0,191.9,107,206.9,127,272.0,88,12.6,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,92,29,155.4,110,188.5,104,254.9,118,8.0,4,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,131,25,192.7,85,225.9,105,254.2,59,10.9,6,2,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,69,0,143.6,88,141.8,86,194.0,83,10.8,5,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,93,0,114.3,100,221.1,103,126.3,88,10.9,9,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,90,0,193.7,83,154.2,79,299.0,60,12.7,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,60,0,125.1,99,248.8,62,211.3,79,11.2,3,3,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 
+0,139,0,203.2,81,152.5,99,197.8,76,9.7,3,2,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,75,39,198.2,107,280.4,132,129.6,73,11.3,7,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,54,0,273.8,113,119.6,156,267.6,117,11.7,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,91,0,190.5,128,205.5,103,130.7,63,13.8,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,189,30,155.2,116,195.5,50,170.1,108,15.4,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,84,0,203.4,125,182.9,88,213.7,121,13.8,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,87,36,171.2,138,185.8,102,227.6,97,10.8,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,1 +1,60,0,289.8,101,255.6,115,242.8,76,11.7,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +1,40,0,81.7,123,210.2,108,212.0,64,11.3,3,6,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,104,0,139.7,78,202.6,119,203.6,102,11.3,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,0,1,0,1,0 +0,99,39,126.8,94,293.6,115,174.1,91,8.4,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1 +0,124,0,167.4,119,233.2,143,109.6,115,10.3,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,127,0,176.9,110,167.9,100,182.2,138,7.7,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,1,0 +0,33,0,251.9,81,194.6,96,211.2,87,8.4,3,2,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,113,0,159.8,143,210.1,93,175.1,86,13.1,7,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,39,0,187.2,110,114.7,116,104.7,83,13.2,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +1,100,0,113.3,96,197.9,89,284.5,93,11.7,2,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,80,0,206.3,97,154.9,98,263.6,82,12.4,12,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,127,28,235.6,124,236.8,113,241.2,127,7.7,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,69,0,228.2,70,263.7,80,142.6,60,10.7,5,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,116,0,288.0,120,255.8,90,233.4,99,13.4,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 
+0,145,0,187.9,110,197.0,117,167.0,108,4.8,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,94,0,190.4,91,92.0,107,224.8,108,13.6,17,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,145,0,245.8,116,286.7,91,240.7,115,9.0,13,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,40,31,224.7,69,134.5,81,120.3,104,7.5,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,43,0,241.9,101,129.4,121,264.8,104,5.9,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,147,0,168.6,92,187.7,107,216.5,95,14.4,8,2,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,68,0,232.4,76,153.3,103,214.6,107,10.5,2,1,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,90,22,124.5,94,231.7,90,222.2,108,6.4,12,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,110,0,242.5,110,162.3,140,184.1,86,7.8,3,4,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,57,30,179.2,105,283.2,83,228.1,77,14.7,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,137,0,242.1,118,191.0,93,218.6,50,14.7,2,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +1,89,0,303.9,95,260.9,114,312.1,89,5.3,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +1,71,0,238.0,82,278.5,94,193.1,134,11.8,10,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,166,0,203.4,81,167.7,110,132.0,124,9.2,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,125,0,298.4,78,270.5,142,107.3,84,12.2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +0,63,0,207.6,96,229.0,112,162.6,131,13.3,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,83,0,159.3,104,202.3,98,229.0,73,9.5,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,31,28,171.8,116,240.7,125,245.5,80,10.6,7,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,145,0,129.4,97,185.4,101,204.7,106,1.1,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0,1,0 +0,146,0,205.4,101,134.9,77,310.5,83,10.3,2,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,1,0,1,0 +0,12,0,216.7,117,116.5,126,220.0,110,9.8,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,1,0 
+0,136,0,204.5,63,208.8,95,224.0,119,9.8,2,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,110,0,185.1,100,165.1,88,111.6,104,6.3,4,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,1,0,1,0 +0,138,0,146.5,101,284.5,142,176.0,98,14.0,6,3,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,107,28,201.8,79,304.9,128,225.6,133,11.9,8,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,1,0,0,1 +0,101,23,262.2,101,157.0,80,129.1,100,7.3,14,1,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +1,91,0,145.0,89,175.8,102,223.7,151,16.7,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +0,96,27,261.3,96,220.9,101,179.4,97,11.3,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,136,0,256.8,90,230.1,104,143.6,82,9.1,10,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,1,0 +0,111,0,142.3,75,122.8,106,229.5,94,12.8,9,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +1,39,0,295.4,126,232.1,117,204.4,123,11.5,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,105,21,147.0,112,197.3,43,267.4,93,8.7,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,92,47,141.6,95,207.9,130,203.6,95,10.2,11,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0,0,1 +0,95,0,141.1,84,211.4,108,103.7,127,5.9,6,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,151,0,194.8,106,292.7,103,224.6,82,5.5,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,108,0,178.3,137,189.0,76,129.1,102,14.6,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,129,0,198.4,91,264.7,106,111.4,101,9.2,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,109,0,193.6,58,148.7,115,282.5,105,13.1,1,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,116,0,245.9,73,240.1,87,158.7,89,8.9,5,3,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,146,0,115.6,77,213.6,100,218.4,72,10.7,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,34,0,151.0,102,131.4,101,186.6,86,9.9,7,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,1,0,1,0 +1,72,0,287.4,116,235.3,126,292.1,114,5.0,3,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 
+0,104,0,164.2,109,155.4,90,168.9,117,10.7,8,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,86,0,150.5,92,120.3,95,271.2,96,9.0,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,1,0,1,0 +0,160,0,171.2,103,243.5,121,178.2,92,13.0,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,129,0,54.7,131,256.1,105,176.6,135,11.1,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,1,0,175.2,74,151.7,79,230.5,109,5.3,3,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,124,0,193.0,97,89.8,99,172.8,104,15.3,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,106,0,223.0,121,110.1,98,188.7,107,7.1,12,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,91,0,151.8,115,103.6,116,156.3,86,12.2,4,1,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,1,0 +0,150,28,174.4,75,169.9,80,201.6,130,11.0,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,27,0,227.4,67,248.0,115,61.4,109,7.8,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,101,0,93.8,127,150.0,104,241.1,116,10.7,2,1,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,82,0,207.0,90,232.9,83,172.4,108,9.1,8,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,78,0,140.7,77,195.2,114,252.9,107,11.7,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,0,1,0,1,0 +0,180,0,143.3,134,180.5,113,184.2,87,10.1,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,24,0,265.6,86,208.8,102,182.5,105,11.1,6,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,128,40,171.2,88,145.7,109,196.8,93,14.0,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,75,0,209.4,133,211.5,121,291.2,123,7.2,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,1,0,1,0 +0,9,16,88.5,87,178.8,108,228.7,96,11.5,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,166,28,175.8,126,253.6,76,128.5,72,11.4,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1 +0,80,30,184.2,132,167.5,109,212.8,114,10.0,10,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1 +0,136,0,102.1,75,219.5,97,73.7,92,9.8,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 
+0,87,33,125.0,99,235.3,81,215.3,95,10.2,7,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,71,22,141.4,107,163.0,105,220.0,99,5.6,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,146,0,169.5,93,230.9,71,269.8,115,9.0,7,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,99,19,87.7,103,223.0,86,182.3,112,7.3,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,0,0,1,0,0,1 +0,80,0,202.4,118,260.2,67,177.4,112,9.2,5,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,140,0,173.2,91,196.8,106,209.3,128,11.2,5,3,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,13,21,315.6,105,208.9,71,260.1,123,12.1,3,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,82,34,232.6,121,153.2,115,286.7,77,4.7,3,3,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,123,39,270.4,99,245.1,110,108.9,113,15.4,7,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,113,23,205.0,101,152.0,60,158.6,59,10.2,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,42,0,241.2,134,116.5,114,152.2,91,10.6,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,86,0,83.8,121,240.2,96,158.6,108,6.7,8,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,103,0,166.6,84,192.4,91,167.9,115,7.7,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,124,0,191.3,134,261.5,113,182.3,111,10.0,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +1,47,0,47.8,120,178.9,123,152.6,96,13.3,7,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,1,0 +0,101,0,124.8,66,257.2,85,193.2,115,13.4,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,118,24,118.1,83,109.6,72,245.5,73,16.9,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,87,0,124.3,91,173.4,105,256.3,109,7.5,5,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0,1,0 +0,100,0,203.1,96,217.0,126,180.9,122,13.5,2,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,122,0,136.7,115,243.1,137,188.9,110,8.6,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,0,1,0,1,0 +0,144,51,283.9,98,192.0,109,196.3,85,10.0,4,1,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 
+0,132,0,206.2,100,211.2,118,196.2,122,10.2,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,140,0,159.1,104,269.8,106,220.4,116,10.3,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,103,31,107.7,124,188.9,104,196.2,98,8.9,3,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,90,0,113.2,108,189.3,63,271.8,124,14.1,4,3,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,136,0,259.4,99,172.7,125,293.7,78,10.7,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,109,0,170.7,101,240.2,82,119.0,112,11.4,4,2,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,134,0,202.7,105,224.9,90,253.9,108,12.1,7,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,106,0,235.2,121,220.6,87,236.3,91,11.8,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,1,0,1,0 +0,68,0,226.7,94,168.4,129,188.7,117,10.2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,159,19,184.1,78,194.5,71,225.6,101,16.9,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,140,0,231.9,101,160.1,94,110.4,98,14.3,6,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,117,0,149.9,95,256.1,110,212.7,92,13.3,13,2,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,65,0,105.7,95,141.8,100,180.5,105,6.6,12,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,52,0,251.4,118,196.6,80,192.0,53,11.0,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,1,0,1,1,0 +0,93,21,117.9,131,164.5,115,217.0,86,9.8,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,75,41,130.9,115,203.4,110,171.7,68,12.4,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,1,0,0,1 +0,72,0,253.0,73,219.3,78,210.8,89,9.8,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,105,0,251.6,88,175.1,103,184.4,112,5.4,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,125,0,140.1,132,209.6,126,264.1,77,8.0,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0,1,0 +0,57,0,158.1,117,115.2,149,182.4,92,11.8,7,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,86,21,197.9,99,165.6,100,208.0,120,10.1,9,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 
+0,98,21,64.6,98,176.1,86,244.8,84,0.0,0,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,71,39,183.2,103,209.4,111,172.4,109,11.9,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,156,0,178.8,94,178.4,97,169.2,77,7.5,3,1,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,103,0,167.8,121,212.9,123,208.2,73,13.0,3,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,1,0,1,0 +0,165,0,156.0,88,276.1,81,175.9,94,9.3,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1,0,1,0 +0,73,0,175.4,130,248.1,105,122.4,85,12.2,4,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,117,14,80.2,81,219.0,103,122.6,102,8.6,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1,0,0,1 +0,121,0,177.2,142,123.5,88,213.2,51,8.4,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,1,0,1,0 +0,55,20,189.3,95,118.6,113,250.2,102,12.5,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,88,31,181.6,91,213.2,120,207.8,104,11.4,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,84,42,165.3,97,223.5,118,260.8,72,7.6,7,3,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,117,0,144.6,115,258.8,66,253.2,113,7.4,9,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,65,0,211.3,120,162.6,122,134.7,118,13.2,5,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,110,0,99.4,62,275.0,86,212.1,94,16.7,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,149,18,148.5,106,114.5,106,178.3,98,6.5,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,140,0,125.3,84,167.6,121,260.6,94,8.4,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,82,0,189.2,81,184.4,117,255.8,83,10.6,5,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,55,0,189.0,100,118.5,99,248.1,87,17.1,6,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,41,41,207.3,95,137.3,120,115.7,74,5.9,3,1,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,190,0,182.2,101,212.3,95,233.0,123,9.3,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,55,0,132.0,103,279.6,114,180.0,74,13.5,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 
+0,134,0,242.4,126,152.9,115,318.3,115,11.8,6,1,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +1,45,0,78.6,106,187.3,110,184.2,111,7.4,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,1,0,1,0 +0,70,0,126.3,99,141.6,106,255.9,96,9.6,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,135,0,190.9,44,161.4,109,231.9,100,8.4,2,1,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,127,28,95.9,117,159.5,131,152.8,132,10.4,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1,0,1 +0,138,0,220.2,89,88.3,125,195.3,79,12.9,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,101,0,257.3,84,184.8,115,108.9,109,13.5,7,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,121,0,190.7,103,183.5,117,220.8,103,9.8,4,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,126,26,129.3,123,176.5,114,154.5,102,9.6,7,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1 +0,95,0,149.2,96,260.7,116,201.0,120,8.1,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,143,0,119.1,117,287.7,136,223.0,100,12.2,4,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,76,0,143.7,55,173.1,108,239.1,95,5.8,6,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,78,0,147.1,80,199.7,100,160.7,106,13.7,7,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0,1,0 +0,75,0,153.2,78,210.8,99,153.5,100,7.8,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,1,0 +0,116,0,146.4,123,176.6,113,212.6,102,7.8,5,1,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +1,197,0,127.3,80,222.3,115,173.9,95,13.7,5,5,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +0,91,31,273.0,78,215.5,98,104.7,114,9.6,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +1,115,0,345.3,81,203.4,106,217.5,107,11.8,8,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,1,0 +0,16,0,144.8,84,164.9,141,231.5,75,8.2,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,103,0,173.5,83,244.3,65,221.6,66,9.7,2,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,67,35,245.4,89,148.2,102,274.0,136,7.5,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 
+1,51,0,153.6,108,232.9,85,214.2,92,14.1,4,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +0,62,0,182.3,101,328.2,93,245.0,131,11.2,1,2,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,90,0,102.0,118,113.3,134,188.6,105,11.4,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,1,0,1,0 +1,95,41,136.8,91,200.8,61,133.7,67,10.3,9,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,1,0,1 +0,116,0,167.8,119,142.0,123,190.7,128,7.3,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +1,138,0,286.2,61,187.2,60,146.2,114,11.0,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,1,0 +0,82,29,207.2,111,254.1,137,169.3,92,9.5,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,78,13,281.2,93,178.2,101,244.2,129,6.4,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,83,0,134.8,96,167.2,78,161.5,123,7.7,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,0,1,0,1,0 +0,108,0,187.4,101,199.9,126,216.1,107,12.6,8,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,212,0,126.0,96,144.3,80,302.8,102,7.6,3,1,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,71,0,207.0,112,173.8,96,178.4,61,12.1,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,93,32,116.9,120,232.4,97,127.7,112,11.0,9,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,48,0,275.2,67,180.2,108,159.0,110,7.9,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1,0,1,0 +0,79,32,50.6,62,201.4,87,146.8,121,4.2,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,173,0,172.5,78,142.6,91,102.0,63,10.9,2,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,118,0,154.8,71,244.0,73,159.6,81,12.8,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,0,0,1,0,1,0 +0,145,30,175.3,107,153.3,116,233.6,85,11.1,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1 +1,136,0,269.8,106,228.8,101,257.5,106,10.1,8,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,148,33,241.7,84,165.8,84,160.6,80,11.3,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,112,0,174.5,127,259.3,71,170.5,120,11.3,7,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 
+0,132,0,117.6,66,214.0,108,239.5,94,8.8,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,82,0,135.4,102,237.1,122,118.3,91,17.5,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,107,0,204.5,108,162.4,110,155.0,102,13.4,1,3,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,114,31,174.5,104,224.2,92,116.3,91,12.3,10,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,0,1,0,1 +1,37,0,239.9,120,261.6,88,207.1,88,8.9,4,2,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,107,32,134.2,101,211.9,145,167.6,138,8.2,5,1,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,52,32,130.1,68,247.2,77,289.4,87,13.5,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,52,21,195.7,119,106.2,95,157.4,94,5.3,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,1,0,0,1 +0,77,0,67.7,68,195.7,86,236.5,137,12.0,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,1,0,1,0 +0,177,0,232.8,106,175.2,97,212.2,77,12.5,7,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,118,0,113.0,80,150.1,87,204.3,115,10.8,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,1,0,1,0 +0,158,0,158.0,106,292.5,114,241.1,89,9.1,4,1,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,103,34,138.8,80,142.0,108,183.8,77,11.8,7,1,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +1,65,0,245.7,139,241.9,113,285.3,117,4.2,5,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,127,24,154.8,69,177.2,105,207.6,102,9.0,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,136,0,152.6,97,208.9,85,119.1,99,5.0,10,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,0,1,0,1,0 +0,110,0,18.9,92,258.4,81,109.6,74,14.8,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +1,89,0,213.0,63,176.6,71,262.6,126,9.1,1,1,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,112,0,194.4,101,190.3,82,183.4,107,11.4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,154,35,64.9,76,184.1,91,151.6,75,14.6,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1,0,0,1 +0,36,0,253.4,77,182.4,151,275.8,103,8.4,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 
+0,183,8,86.5,119,285.2,97,180.4,133,8.7,2,2,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +1,113,20,157.8,83,161.5,56,271.5,100,8.7,2,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,137,0,110.5,79,223.2,111,169.5,64,10.5,3,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,61,0,45.0,108,151.3,74,152.9,94,9.8,6,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,124,0,151.0,98,120.6,119,152.8,81,9.2,2,2,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,87,0,322.5,106,204.6,93,186.2,128,9.4,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,144,33,251.6,87,197.6,118,209.2,97,12.2,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,157,0,168.6,71,205.1,48,175.8,88,5.9,2,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,1,0,1,0 +0,86,0,217.8,93,214.7,95,228.7,70,11.3,7,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,75,0,166.3,125,158.2,86,256.7,80,6.1,5,1,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +1,105,0,226.9,106,182.2,77,203.9,107,11.6,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,101,0,118.6,89,199.6,97,53.3,61,11.5,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,95,0,157.3,116,197.5,77,128.2,111,8.4,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,111,0,229.4,107,214.1,99,289.6,95,10.4,6,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,118,0,189.3,119,233.5,112,270.9,104,10.0,1,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,0,0,1,0,1,0 +0,136,0,92.0,117,253.6,77,214.1,90,10.3,10,1,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,67,0,179.8,125,173.2,86,272.8,97,10.9,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,137,0,144.0,90,181.6,100,128.1,93,12.3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,70,24,249.5,101,259.7,98,222.7,68,9.8,4,1,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,148,26,244.9,150,118.0,138,236.0,91,15.2,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0,0,1 +0,88,0,148.2,82,308.7,67,235.4,79,6.4,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,80,0,276.5,122,195.6,79,210.3,78,7.2,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 
+0,82,33,137.8,95,235.5,128,268.1,70,11.0,6,2,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,104,0,170.6,97,162.1,111,210.7,131,6.1,1,1,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,94,0,85.9,113,226.7,91,279.6,110,15.6,16,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0,1,0 +0,66,0,170.5,103,254.3,77,197.3,138,10.5,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,87,21,214.0,113,180.0,114,134.5,82,10.6,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +1,88,0,85.7,112,221.6,70,190.6,75,11.6,3,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,117,0,198.4,121,249.5,104,162.8,115,10.5,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,1,1,0 +0,80,0,199.8,138,167.1,91,271.8,94,5.5,4,1,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,35,0,158.6,67,130.4,96,229.8,80,6.9,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,115,0,184.1,98,327.0,73,212.5,106,7.5,6,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,65,34,208.8,119,142.1,106,214.6,87,12.5,4,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,95,0,134.4,104,152.4,95,236.5,80,9.4,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0,1,0 +0,153,0,185.3,127,208.0,73,206.1,124,15.1,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,73,0,157.6,92,198.3,87,364.9,106,9.1,4,1,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,89,0,82.3,77,167.2,80,194.7,70,7.2,4,1,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,149,0,175.4,80,197.4,127,188.2,102,9.7,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +1,100,0,70.8,94,215.6,102,230.8,125,9.5,1,6,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,0,1,0,1,0 +0,172,0,172.5,85,253.1,71,221.6,113,5.9,6,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,67,0,152.5,131,252.4,107,185.4,104,4.9,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,1,0,1,0 +0,36,25,152.8,110,242.8,67,147.4,74,9.1,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,81,0,227.4,105,211.5,120,258.2,113,11.9,3,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,1,0 +0,79,34,103.7,100,236.3,78,256.6,102,14.8,4,2,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 
+0,139,0,211.1,103,206.9,108,193.9,70,5.6,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,117,22,196.0,82,322.7,82,225.6,120,3.7,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1 +0,102,0,158.0,94,207.9,100,190.4,120,10.1,10,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +1,88,0,235.1,98,251.8,79,285.9,76,7.2,2,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +0,176,0,201.9,101,154.7,78,164.4,79,9.0,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +1,105,0,281.3,124,301.5,96,202.8,109,8.7,3,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,160,0,256.0,111,187.4,61,119.1,81,11.5,4,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,48,0,188.4,63,165.9,89,205.7,71,13.2,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,1,0,1,0 +0,95,0,135.0,99,183.6,106,245.3,102,12.5,9,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,99,0,124.6,90,146.4,70,169.4,95,10.5,6,1,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,125,0,229.3,103,177.4,126,189.3,95,12.0,8,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,94,0,212.1,98,189.4,89,352.2,95,8.4,5,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,64,40,210.0,116,232.7,89,168.8,94,5.9,4,8,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +1,70,0,232.8,95,303.4,111,255.6,104,12.9,7,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,28,0,180.8,109,288.8,58,191.9,91,14.1,6,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,93,42,166.9,101,273.2,84,171.0,106,11.5,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1,0,0,1 +0,138,0,251.0,119,91.2,96,142.2,87,13.8,3,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,1,0,1,0 +0,123,0,206.9,115,224.4,86,197.4,60,8.3,2,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +1,98,0,227.1,116,120.5,103,117.0,102,4.7,4,5,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,137,0,97.5,95,195.8,82,288.8,78,0.0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,1,0,1,0 +0,174,0,124.3,76,277.1,112,250.7,115,15.5,5,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,149,20,264.4,102,219.6,123,200.4,89,11.3,3,2,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 
+0,163,0,191.3,89,193.9,87,268.4,121,12.8,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,117,0,184.8,83,248.6,101,133.1,113,9.6,8,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,89,0,105.9,151,189.6,142,170.9,67,12.7,7,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,148,0,239.3,84,195.7,85,232.6,104,10.9,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,148,14,93.6,137,193.8,72,144.9,84,17.5,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,118,39,153.8,106,123.3,111,117.8,103,9.2,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,1 +0,17,0,161.5,123,214.2,81,315.0,106,8.6,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,35,37,181.2,76,177.6,98,228.0,136,5.0,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,98,0,236.2,122,189.4,110,153.6,104,13.3,4,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,98,0,206.5,92,176.2,152,232.8,115,12.4,5,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,40,0,220.8,100,265.7,106,212.8,94,6.4,3,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,1,0 +0,93,0,164.5,95,230.9,87,149.9,91,9.9,3,4,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,101,33,219.7,137,186.8,94,184.5,113,9.5,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,132,0,169.9,107,209.4,121,206.1,79,11.5,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,150,0,126.0,99,238.5,73,285.1,100,10.2,6,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,59,0,195.0,58,198.5,88,304.3,110,14.8,9,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,42,0,180.7,127,174.6,94,165.3,114,12.0,6,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,1,0 +0,86,0,223.9,75,155.7,109,150.2,143,7.3,9,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,75,0,117.3,114,201.1,61,107.9,82,12.2,3,1,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,107,0,133.8,85,180.5,94,112.2,115,8.9,4,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,109,0,153.1,102,234.1,77,329.2,74,9.9,9,3,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 
+1,65,0,277.9,123,155.8,112,256.9,71,9.2,10,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +0,113,0,158.9,137,242.8,109,247.8,97,6.5,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,69,0,135.4,101,238.1,124,195.6,102,10.6,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,60,0,207.8,109,123.5,112,291.6,115,5.7,9,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,59,0,107.8,113,216.6,125,217.5,92,9.9,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,133,0,162.1,91,212.1,94,260.4,78,12.2,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,85,0,183.4,111,168.8,98,199.7,97,9.9,4,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,150,0,189.3,77,220.9,105,238.7,117,9.2,5,4,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,125,0,126.7,108,206.0,90,247.8,114,13.3,7,1,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +1,98,0,217.2,121,303.4,73,197.1,71,12.4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,78,0,163.6,88,283.4,93,262.1,108,8.6,9,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,102,0,156.5,67,204.3,103,141.9,72,9.9,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,0,0,1,0,1,0 +0,97,25,141.0,101,212.0,85,175.2,138,4.9,2,3,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,82,0,143.7,116,170.7,99,287.7,95,7.8,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1,0,1,0 +0,1,0,144.8,107,112.5,66,218.7,79,13.8,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,173,0,109.4,103,101.3,111,167.3,106,7.8,7,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,1,0,1,0 +0,32,0,230.9,87,187.4,90,154.0,53,6.3,2,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,62,0,186.8,94,207.6,92,195.0,98,8.8,4,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,74,0,230.9,93,223.0,78,157.8,101,9.7,2,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,243,0,95.5,92,163.7,63,264.2,118,6.6,6,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,110,0,222.7,94,105.8,98,214.8,78,13.5,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,79,0,152.2,112,177.2,132,96.4,87,5.3,3,1,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 
+0,131,28,249.6,87,227.2,138,239.9,92,7.6,3,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,81,0,129.9,121,230.1,105,140.5,123,13.3,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,124,0,188.5,77,182.0,123,218.2,127,6.1,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,78,0,75.0,116,248.7,87,176.0,83,9.5,6,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,37,39,149.7,122,211.1,75,114.3,90,9.2,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,1 +0,107,30,198.9,87,207.0,90,159.8,76,12.6,4,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,73,0,240.3,130,162.5,83,231.9,136,11.9,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0,1,0 +0,68,0,222.1,107,199.4,102,162.4,107,9.4,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,148,36,77.6,141,207.0,60,255.7,115,10.9,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,67,0,167.8,91,167.7,69,110.3,71,8.4,12,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1,0,1,0 +0,53,37,167.3,99,194.7,99,236.7,112,12.0,9,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,142,0,187.0,133,134.6,74,242.2,127,7.4,5,2,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,83,25,191.3,95,250.7,136,249.4,86,17.6,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,117,0,201.9,86,212.3,96,176.9,98,7.8,10,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,137,0,147.2,119,192.8,91,172.7,105,10.2,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,115,0,146.7,128,106.2,74,197.7,104,11.1,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +1,71,0,178.2,113,167.8,94,182.1,111,13.6,3,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +0,146,0,206.3,151,148.6,89,167.2,91,6.1,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,114,32,125.2,79,177.8,105,232.4,89,12.9,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0,0,1 +0,107,0,103.4,94,189.3,125,227.2,125,14.4,3,1,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,116,0,201.8,82,231.5,95,226.1,130,16.5,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 
+0,126,23,114.3,102,190.3,103,240.4,111,12.6,7,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,85,21,283.2,110,239.7,108,149.5,80,6.3,1,5,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,43,0,168.4,125,243.8,89,214.7,102,11.1,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0,1,0 +0,20,0,186.8,89,253.4,51,273.1,105,12.3,6,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1,0,1,0 +0,133,0,176.8,92,187.5,97,196.8,88,6.5,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,1,0,1,0 +0,100,0,210.9,85,329.3,69,127.1,78,9.4,5,4,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,108,0,169.6,99,264.1,87,206.3,78,9.3,4,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,155,0,163.0,93,203.9,102,159.0,109,15.1,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,146,0,189.3,77,155.9,128,186.0,83,7.4,3,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,98,31,121.0,105,218.9,98,226.7,110,12.0,1,1,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,80,0,197.5,114,206.9,119,163.6,109,11.3,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,1,0,1,0 +0,106,0,158.6,112,220.0,114,252.9,106,9.1,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,78,0,252.9,93,178.4,112,263.9,105,9.5,7,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,116,0,51.1,106,208.6,137,198.0,92,12.3,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,141,22,215.4,123,328.7,98,160.5,89,7.8,6,1,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,77,0,239.2,114,150.0,115,160.8,81,10.3,2,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,143,0,172.3,97,174.0,108,188.2,119,13.0,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,72,0,147.0,79,162.3,103,162.9,80,10.5,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,87,22,240.8,102,75.9,106,224.6,115,7.1,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +1,126,0,239.7,87,281.7,92,183.5,113,11.4,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +0,80,0,198.1,160,156.7,87,182.1,76,9.3,3,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 
+0,72,0,138.9,111,211.6,102,179.5,91,10.8,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +1,115,0,139.3,89,192.3,95,151.0,75,9.3,3,7,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,43,0,177.2,93,142.6,60,314.1,144,12.7,2,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,100,0,185.0,122,182.5,92,274.9,92,5.1,8,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,139,0,221.3,140,157.8,89,192.5,89,11.3,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,77,0,189.5,112,207.0,95,214.1,91,9.2,7,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,88,0,144.3,116,156.4,74,214.7,90,7.8,10,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,55,0,119.7,148,231.8,96,222.3,113,4.6,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,95,32,262.2,123,165.2,82,194.3,57,10.6,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,143,0,160.4,120,285.9,104,182.5,85,6.9,4,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,1,0,1,0 +0,142,25,191.1,109,149.6,120,227.8,60,9.8,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,1 +0,179,38,220.1,78,234.3,71,237.3,85,10.1,4,4,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1 +0,154,32,192.3,82,165.3,134,205.0,74,9.0,1,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,118,0,133.4,113,121.0,92,254.7,129,5.9,4,1,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,142,0,84.8,95,136.7,63,250.5,148,14.2,6,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,193,17,124.0,102,202.9,81,205.1,129,12.3,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,0,0,1,0,0,1 +0,76,0,171.1,78,257.2,83,91.6,92,16.2,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,106,26,270.3,111,215.2,90,254.0,133,14.4,7,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1,0,1 +0,116,35,200.4,104,272.8,89,214.5,100,8.3,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,1,0,0,1 +1,68,29,195.5,113,171.6,96,204.0,85,13.5,9,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1 +0,121,21,126.3,84,209.6,102,192.5,129,10.6,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 
+0,104,0,138.7,107,256.9,113,234.9,74,10.0,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,0,0,1,0,1,0 +0,157,0,224.5,111,200.7,99,116.6,118,11.5,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,102,0,195.7,116,209.1,87,201.1,73,8.3,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,70,0,230.3,110,77.9,87,247.1,105,13.2,4,1,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,82,29,163.8,77,134.9,112,79.3,95,8.8,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,67,0,260.4,107,208.2,104,207.9,115,10.0,2,6,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,0,0,1,0,1,0 +0,75,19,210.3,90,241.8,87,215.7,102,13.1,3,4,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,22,14,199.1,100,221.8,103,65.7,91,4.2,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,116,0,89.5,128,180.8,137,193.1,94,14.0,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,154,0,166.9,99,154.9,97,189.4,89,7.2,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,117,0,134.7,121,180.0,83,200.9,104,7.7,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,136,0,183.4,103,141.9,113,200.4,122,10.4,9,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,1,0 +0,24,0,156.2,104,90.0,101,205.1,116,7.3,5,1,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,107,31,160.3,45,221.5,70,261.6,109,5.6,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0,0,1 +0,120,27,153.5,84,194.0,73,256.5,94,10.2,7,5,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,114,31,222.8,98,180.5,105,151.3,101,13.0,4,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,127,0,261.7,105,181.8,107,100.9,131,3.3,5,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,121,44,116.0,85,150.1,120,246.8,98,12.0,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,103,29,164.1,111,219.1,96,220.3,108,12.3,9,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,55,39,139.3,101,178.3,117,246.5,104,8.1,1,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,66,16,174.7,92,232.1,105,305.4,98,8.9,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 
+0,130,0,183.0,112,72.9,99,181.8,78,9.5,19,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,132,33,200.3,75,226.6,67,198.8,91,12.9,3,2,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,1 +0,40,41,148.1,74,169.5,88,214.1,102,6.2,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,0,1,0,0,1 +0,84,0,191.0,88,318.8,119,247.3,79,6.5,4,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,115,0,206.2,113,176.4,102,297.1,119,11.0,7,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,177,0,84.9,77,257.5,109,210.5,66,7.5,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,51,26,236.8,61,263.4,97,181.1,91,11.2,8,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +1,103,18,149.9,84,170.9,84,171.5,112,11.5,7,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,144,18,106.4,109,108.1,113,208.4,111,10.1,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +1,21,0,92.6,95,161.9,70,285.0,78,11.3,3,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,135,28,201.4,100,246.5,117,154.8,131,12.9,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +1,111,0,172.8,58,183.1,108,158.8,104,7.9,3,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,103,0,255.9,128,140.9,92,308.9,130,12.1,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,1,0 +0,123,0,114.8,94,150.0,104,268.6,119,9.6,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +1,96,37,172.7,93,120.1,116,216.1,86,10.3,5,5,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,87,19,165.8,122,186.9,89,249.7,78,0.0,0,1,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,85,0,126.1,112,274.7,126,184.4,95,9.8,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,162,0,135.2,98,242.0,107,246.9,96,10.2,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,101,0,239.0,156,273.0,106,278.2,93,13.5,8,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +1,93,0,271.1,101,237.4,133,145.4,103,8.4,6,1,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,64,0,145.5,116,228.4,110,273.4,91,8.9,8,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,0,0,1,0,1,0 
+0,110,0,241.2,105,174.3,85,245.3,59,8.5,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,0,1,0,1,0 +0,106,0,114.4,104,78.3,101,232.7,78,0.0,0,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,91,0,154.4,165,168.3,121,239.9,81,11.7,4,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,118,0,140.4,112,187.1,60,207.9,155,7.9,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,185,0,139.6,92,250.2,115,158.1,79,10.8,4,1,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +1,99,0,115.5,75,218.1,111,254.9,98,11.5,7,7,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,0,1,0,1,0 +0,80,0,105.8,110,43.9,88,189.6,87,13.1,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,59,0,155.2,79,235.3,123,169.4,80,8.7,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,41,0,223.8,67,244.8,74,223.8,156,12.3,5,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0,1,0 +0,156,0,174.3,95,186.6,128,258.2,105,12.9,5,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,101,28,105.9,132,231.7,107,281.3,120,10.7,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,78,0,193.4,99,116.9,88,243.3,109,9.3,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,184,12,200.3,76,253.6,105,149.3,93,10.2,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1,0,0,1 +0,68,0,195.4,116,212.1,101,138.4,134,15.1,11,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0,1,0 +0,137,0,243.4,114,121.2,110,162.6,104,12.2,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,159,0,169.8,114,197.7,105,193.7,82,11.6,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0,1,0 +0,142,0,145.4,93,209.1,98,214.0,96,10.9,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,114,0,189.8,101,147.7,80,172.7,121,10.6,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,101,0,0.0,0,192.1,119,168.8,95,7.2,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,1,0,1,0 +0,113,0,215.6,96,193.4,127,105.4,115,13.5,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,73,0,254.8,85,143.4,80,153.9,102,15.0,7,2,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 
+... (several hundred additional CSV data rows omitted: each row is a binary target value followed by numeric usage features and one-hot encoded categorical indicator columns) ...
+1,68,0,148.5,126,219.4,125,198.5,121,14.5,7,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,0,0,0,1,1,0 +0,57,29,279.9,121,223.1,109,251.7,94,13.0,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +1,68,0,213.9,112,260.5,100,233.8,97,8.4,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,107,0,133.3,106,182.9,89,241.1,123,12.9,2,3,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +1,114,0,147.1,119,161.0,111,275.9,106,9.0,3,5,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,158,0,222.8,101,203.0,128,210.6,106,6.9,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,1,0,1,0 +1,98,0,0.0,0,159.6,130,167.1,88,6.8,1,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,174,0,235.5,108,142.3,143,316.7,131,12.5,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,132,31,174.5,101,245.6,105,172.8,76,10.3,9,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,76,0,203.6,61,161.7,127,175.9,97,8.4,3,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +1,117,0,287.4,118,259.6,84,153.2,86,10.0,3,1,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +0,67,0,115.5,70,252.2,143,208.9,91,7.5,6,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,120,29,244.3,140,322.3,89,166.8,83,10.6,6,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,1,0,0,1 +0,122,30,230.1,108,287.6,76,177.1,85,6.9,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +1,112,0,335.5,77,212.5,109,265.0,132,12.7,8,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,86,0,136.4,104,202.5,110,230.7,86,11.5,1,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0,1,0 +0,88,0,202.2,86,216.8,93,239.4,99,11.8,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,110,0,148.4,95,193.8,98,206.0,106,6.9,6,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,62,33,186.4,84,201.0,136,286.7,103,11.1,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,151,17,214.7,97,138.5,90,169.1,44,8.6,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +1,113,0,239.7,47,282.9,110,238.4,88,8.7,3,2,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 
+0,108,0,293.0,88,160.6,101,143.9,87,10.0,6,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,101,0,193.7,108,186.6,98,223.0,100,11.6,8,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +0,132,15,154.6,128,245.6,106,148.6,90,9.1,4,1,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,66,40,141.7,87,268.3,89,241.3,68,8.5,7,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,115,0,195.9,111,227.0,108,313.2,113,13.2,1,2,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,38,25,142.4,106,313.7,109,126.6,117,13.4,6,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,89,0,137.9,96,192.6,63,255.7,125,11.0,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,1,0,1,0 +0,99,0,62.9,81,231.0,64,168.9,121,8.5,5,1,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,66,32,187.8,117,129.8,90,132.3,113,12.0,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,60,0,180.3,67,208.0,68,181.2,101,12.8,6,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,157,30,194.3,107,243.2,108,322.2,114,7.1,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,106,31,197.4,125,123.4,110,115.6,101,12.3,4,3,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,16,0,174.7,83,280.8,122,171.7,80,10.5,8,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0,1,0 +0,60,0,193.9,118,85.0,110,210.1,134,13.2,8,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,0,1,0,1,0 +0,89,0,134.9,59,156.0,152,197.5,112,10.2,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,13,0,58.4,121,262.2,64,159.0,115,11.9,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,97,0,168.6,87,259.2,105,279.8,123,7.3,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,73,0,286.4,109,178.2,67,214.2,152,10.7,14,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,148,0,208.4,120,174.4,99,310.7,105,11.2,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,177,27,230.2,106,196.1,78,215.4,108,10.4,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,123,27,198.7,127,249.0,105,173.2,124,12.5,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 
+0,62,42,137.3,95,184.2,94,231.4,70,10.2,3,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,78,0,149.7,119,182.2,115,261.5,126,9.7,8,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,94,0,206.1,49,224.6,115,256.7,74,13.0,1,1,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,79,0,157.6,85,194.1,92,231.5,86,9.4,10,5,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,131,39,69.1,122,101.3,136,104.8,94,9.1,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,1,0,0,1 +0,185,30,154.1,114,118.7,106,258.4,105,12.9,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1,0,1 +0,81,36,115.9,120,236.6,95,255.0,90,11.7,6,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,77,24,149.4,74,123.9,72,174.3,84,10.1,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,141,28,206.9,126,264.4,126,171.8,124,9.3,11,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,116,27,175.5,137,210.6,60,294.8,121,6.9,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1 +0,172,0,169.8,123,183.1,94,395.0,72,12.7,7,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,1,0,1,0 +0,136,21,179.4,88,181.1,97,320.7,120,9.5,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,124,0,234.4,61,179.3,111,285.5,117,10.4,6,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,96,45,248.8,124,140.3,77,263.6,102,10.3,2,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,155,23,243.9,112,133.0,106,213.7,123,13.4,11,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,168,0,183.2,131,179.2,73,292.8,100,9.9,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,16,0,153.2,65,229.7,90,148.2,94,10.7,8,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,148,0,243.0,115,191.8,91,117.8,93,13.4,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1,0,1,0 +0,122,0,144.2,87,212.2,74,169.3,87,9.5,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,96,29,150.0,91,159.4,75,228.1,55,8.5,3,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,50,22,252.9,112,177.9,99,158.4,146,8.5,4,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 
+0,64,0,194.2,147,173.4,87,268.7,114,5.5,2,2,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,109,35,230.5,116,265.8,130,269.7,69,10.6,6,5,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,90,30,183.8,76,229.7,95,144.1,124,7.7,3,1,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +1,69,33,271.5,98,253.4,102,165.4,85,8.2,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1 +1,95,0,269.0,120,233.7,120,179.3,61,7.3,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,37,0,271.7,112,155.1,96,199.5,97,6.6,4,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,87,0,177.2,72,248.9,105,200.8,87,8.6,7,3,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,98,0,171.7,99,174.8,87,189.6,130,7.8,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,75,0,150.6,99,301.5,83,158.7,104,8.1,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,11,38,209.8,130,196.6,84,233.0,79,7.0,7,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,1,0,0,1 +0,79,0,236.8,135,186.4,87,126.9,112,10.4,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,54,24,92.3,88,193.1,98,99.3,119,11.6,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,117,0,191.1,93,282.8,56,84.8,118,12.0,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,0,1,0,1,0 +1,149,0,119.2,88,168.3,110,204.7,119,12.2,6,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,110,40,202.6,103,118.8,128,234.9,98,9.0,9,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,65,0,207.7,109,217.5,117,125.6,111,8.0,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +1,82,0,300.3,109,181.0,100,270.1,73,11.7,4,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,101,0,220.3,124,188.6,101,278.4,98,10.6,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,120,27,128.5,115,163.7,91,242.9,121,0.0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,182,0,104.9,111,198.5,120,258.2,91,8.0,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,80,0,151.5,89,131.7,78,235.3,131,11.8,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0,1,0 +0,117,17,221.3,82,167.6,100,262.7,87,4.4,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 
+1,78,0,236.8,141,265.3,101,152.4,77,9.5,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +0,78,0,208.9,119,252.4,132,280.2,120,12.8,7,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,129,0,157.0,113,256.9,97,185.5,126,12.1,2,2,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,146,0,204.4,135,219.1,90,222.7,114,10.5,6,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,1,1,0 +0,70,0,147.1,105,200.0,135,234.9,65,12.5,9,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,54,0,210.5,102,204.5,83,127.8,53,8.5,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,88,45,80.3,140,153.3,101,309.2,123,12.8,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,92,0,157.7,101,298.6,100,216.9,99,13.8,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,1,0,1,0 +0,149,0,166.6,61,218.8,107,208.3,131,8.2,6,7,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,122,27,253.7,84,229.2,109,190.5,123,9.2,5,7,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,145,24,147.5,90,175.7,108,252.1,102,15.6,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,117,0,119.0,82,187.5,108,189.3,97,11.5,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,133,0,216.2,67,222.2,133,192.0,95,3.1,1,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,111,0,284.4,89,157.0,113,242.8,91,8.4,8,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,160,0,234.9,136,270.8,134,219.3,101,13.9,2,1,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,123,0,159.5,77,303.8,92,226.9,120,12.0,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,127,25,82.2,95,163.3,109,264.9,104,5.1,6,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,138,0,268.4,81,174.4,115,193.5,96,11.6,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,1,0 +0,65,0,195.4,110,181.2,109,178.5,105,8.9,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,80,0,127.8,67,181.6,112,197.3,63,15.9,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,1,26,208.0,115,185.0,113,177.7,144,8.1,9,1,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 
+0,87,0,228.7,90,163.0,99,154.1,90,11.8,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,86,0,92.8,92,159.6,87,148.7,115,8.8,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,89,25,215.1,140,197.4,69,162.1,117,10.6,10,1,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,153,0,228.9,102,160.7,136,203.1,109,12.5,2,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,29,0,313.2,103,216.3,151,218.4,106,12.8,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,63,32,30.9,113,187.0,113,230.8,101,8.6,7,1,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,73,0,187.8,95,149.2,143,201.4,113,11.0,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,42,0,184.5,98,200.5,93,279.2,91,8.8,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,90,0,221.8,97,203.8,134,215.8,154,8.4,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,0,0,0,1,1,0 +0,105,0,211.1,99,176.7,66,221.5,96,14.7,7,4,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,97,32,90.0,87,276.3,113,185.2,107,8.6,6,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,121,0,103.3,110,129.1,82,167.1,113,10.7,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0,1,0 +0,15,0,135.2,101,152.5,79,224.8,83,8.4,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,128,0,199.3,86,194.8,102,298.2,82,14.3,2,4,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +1,130,0,271.8,129,237.2,128,210.1,91,8.7,2,4,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,73,0,214.2,90,196.8,78,157.9,112,5.9,8,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,1,0,1,0 +0,120,0,98.2,99,186.7,85,146.7,96,9.3,4,2,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,94,28,233.2,88,113.3,102,118.0,71,16.1,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,138,0,230.1,107,212.0,120,174.9,119,13.2,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,0,1,0,1,0 +0,149,0,176.2,87,145.0,81,249.5,92,5.7,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +0,88,0,181.9,90,151.5,87,143.0,100,7.5,3,1,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,101,0,253.2,89,237.9,114,154.3,85,9.7,7,4,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 
+0,209,0,227.2,128,258.4,92,183.5,74,8.9,4,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,134,38,214.4,93,211.7,57,165.0,79,10.0,8,1,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,116,0,189.5,90,189.8,118,205.8,83,13.1,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,32,26,266.7,109,232.3,107,212.8,98,16.3,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1,0,0,1 +0,129,0,207.0,91,154.9,121,245.1,112,13.4,5,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,0,0,1,0,1,0 +1,85,17,89.8,88,233.2,75,165.7,116,9.3,7,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,54,33,161.8,73,273.0,58,153.9,76,13.7,4,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,124,0,157.4,107,167.8,112,188.8,102,8.8,3,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0,1,0 +0,110,0,148.5,115,276.4,84,193.6,112,12.4,3,1,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,37,0,233.7,114,207.9,109,212.7,101,12.0,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,101,36,123.7,125,172.6,106,280.5,127,8.8,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,95,0,175.2,91,244.4,109,75.8,95,7.5,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0,1,0 +1,93,0,312.0,109,129.4,100,217.6,74,10.5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,1,0 +0,117,0,168.8,137,241.4,107,204.8,106,15.5,4,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,66,0,118.0,133,248.1,99,214.4,122,5.3,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,142,40,230.7,101,256.8,88,263.9,92,6.4,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,140,27,188.9,124,160.9,102,197.7,100,11.5,5,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1 +0,68,22,82.5,97,289.9,94,180.0,114,4.8,4,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +1,2,0,132.1,42,138.9,88,192.6,119,9.1,1,2,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +0,83,32,94.7,111,154.4,98,200.4,109,10.6,6,2,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1 +0,154,35,224.0,102,192.0,99,163.1,100,9.6,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 
+0,147,0,124.4,74,320.9,78,157.2,126,10.4,4,2,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,123,0,206.9,85,244.7,78,221.5,136,7.7,2,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,131,0,263.4,123,151.9,74,218.5,101,10.7,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +1,188,26,198.8,115,166.6,67,198.5,118,14.4,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1 +0,127,0,139.6,94,240.9,112,127.1,88,8.8,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,106,30,220.1,105,222.2,109,158.4,96,13.1,8,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,124,0,254.3,113,78.9,104,153.2,69,11.8,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,179,0,219.2,92,149.4,125,244.7,104,6.1,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +1,128,18,222.1,89,160.6,109,218.8,102,13.6,2,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,1 +1,76,0,241.0,120,231.8,96,220.2,67,9.9,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,1,0 +0,93,0,151.4,89,186.4,76,172.5,120,10.9,3,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,124,0,194.0,103,241.0,116,227.5,153,11.9,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,130,0,212.8,102,189.8,137,170.1,105,10.6,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,65,29,215.5,129,161.9,77,128.3,91,8.8,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +1,86,0,162.4,131,167.0,102,128.9,118,11.4,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,1,0 +0,22,0,181.8,108,198.6,148,206.6,96,9.3,3,2,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,102,0,224.2,81,243.3,90,147.8,66,12.0,8,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,1,0 +0,93,0,173.0,131,190.4,108,290.0,66,10.4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,85,0,201.4,52,229.4,104,252.5,106,12.0,3,1,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,112,16,200.3,72,197.8,91,151.1,92,10.4,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1 +0,137,0,215.9,76,145.4,118,186.9,129,12.1,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 
+0,144,0,150.0,69,285.9,73,190.6,121,9.4,15,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +0,94,38,170.1,124,193.3,116,105.9,73,12.8,4,1,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,129,32,211.0,99,155.1,89,234.8,96,11.4,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,40,0,242.5,82,232.9,97,154.0,86,9.6,7,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,118,35,256.3,119,258.1,91,215.5,130,11.7,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,115,0,200.2,92,244.9,107,190.9,96,8.8,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,167,0,244.8,91,60.8,105,176.7,110,10.7,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +0,139,0,165.0,132,249.7,86,170.3,128,12.6,8,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,128,34,142.3,73,194.8,79,239.3,81,16.0,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,67,0,120.9,58,235.0,88,95.1,130,11.4,11,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +0,166,0,173.9,103,276.4,83,190.8,113,15.3,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,1,0,1,0 +0,74,0,203.8,77,205.1,111,154.9,109,9.0,2,1,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,145,0,39.5,78,264.3,106,185.8,90,10.0,6,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,224,0,111.4,133,175.0,66,217.2,106,5.5,6,3,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,115,0,178.7,114,271.0,96,245.9,94,16.4,5,2,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,114,0,193.0,101,250.0,81,133.3,79,9.6,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,119,23,154.0,114,278.0,137,228.4,112,11.8,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,106,0,208.3,89,169.4,67,102.0,90,15.9,4,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,137,0,258.0,112,246.5,117,173.2,100,10.9,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,1,0 +0,32,26,243.5,137,236.8,108,173.3,149,9.0,9,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,74,0,85.7,83,247.7,67,142.4,85,10.1,5,2,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 
+1,108,42,130.1,90,167.0,128,244.7,80,13.6,5,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1 +0,110,0,178.5,124,146.9,141,217.1,102,9.9,7,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,115,0,166.5,111,236.2,98,205.6,92,15.6,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,114,0,172.0,145,276.4,101,193.7,100,10.1,9,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,86,0,216.3,96,266.3,77,214.0,110,4.5,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +1,78,0,108.6,108,209.9,126,222.6,117,7.9,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,136,0,101.7,105,202.8,99,136.2,119,9.4,6,3,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,224,0,171.5,99,160.0,103,212.4,102,5.0,2,1,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,1,0 +0,153,0,154.6,56,263.0,84,367.7,89,15.5,2,1,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +1,99,0,242.3,102,350.9,102,163.1,93,11.3,3,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,1,0 +0,127,0,202.1,103,229.4,86,195.2,113,11.5,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,133,39,239.9,107,253.8,77,128.7,85,6.7,3,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,0,0,1,0,0,1 +0,119,0,98.8,97,146.9,68,190.7,105,10.0,4,3,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,107,36,96.3,83,179.6,91,166.3,121,10.3,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,116,17,193.4,112,240.6,131,248.1,98,11.4,3,5,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,78,0,162.3,116,192.4,86,240.6,100,10.1,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,141,0,151.5,104,242.2,114,304.2,109,10.8,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,0,1,0,1,0 +0,114,0,155.3,75,169.9,87,207.0,133,12.6,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,97,0,218.0,86,184.0,94,240.5,110,6.4,8,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,140,0,162.6,98,206.2,109,141.6,66,8.2,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,183,0,190.0,100,246.6,78,304.2,107,9.5,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 
+0,63,25,190.0,137,116.6,76,141.5,110,12.2,2,1,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,99,31,244.1,71,203.4,58,234.0,115,7.7,4,3,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +1,154,0,350.8,75,216.5,94,253.9,100,10.1,9,1,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,159,0,114.8,98,192.6,101,259.0,108,12.2,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,80,0,149.8,123,276.3,75,241.4,75,10.9,7,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,105,0,119.3,82,185.1,111,157.0,74,10.9,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,52,31,142.1,77,193.0,97,253.4,88,11.0,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,64,0,206.2,76,232.4,76,251.6,96,13.6,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0,1,0 +0,189,0,227.4,84,176.0,81,206.1,120,6.3,4,2,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,102,0,200.6,106,152.5,127,199.4,128,7.7,2,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1,0,1,0 +0,86,0,125.5,139,269.8,93,235.8,110,8.9,8,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,88,0,61.9,78,262.6,114,212.5,110,8.8,2,3,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,123,0,194.0,118,242.0,114,146.3,108,12.1,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,121,0,86.1,100,259.8,113,148.0,79,9.1,9,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,127,0,220.2,108,188.4,124,172.7,113,11.7,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,100,0,247.8,117,130.0,95,134.3,125,6.9,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,101,0,133.5,51,219.6,96,210.0,74,11.7,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +0,85,29,144.6,97,140.0,102,165.4,148,10.9,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,141,0,185.1,126,233.0,98,152.2,106,9.1,7,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,183,0,108.3,87,183.6,116,176.6,109,13.5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,92,0,173.1,140,240.3,105,233.2,117,9.0,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 
+0,48,0,190.4,92,317.5,85,133.4,113,8.3,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,78,21,160.6,85,223.1,79,124.0,92,9.5,1,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +1,170,42,199.5,119,135.0,90,184.6,49,10.9,3,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,101,9,160.1,116,210.0,121,139.1,65,10.8,9,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0,0,1 +1,110,0,293.3,79,188.5,90,266.9,91,14.5,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +0,108,41,171.6,110,136.1,78,183.4,103,10.8,7,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +1,123,0,236.2,135,273.9,88,227.0,77,10.1,6,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,87,22,263.8,65,103.4,115,208.1,109,8.5,3,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0,0,1 +0,90,27,156.7,51,236.5,118,123.2,111,12.6,6,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,87,0,115.4,90,262.6,68,245.7,69,13.1,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +1,162,0,70.7,108,157.5,87,154.8,82,9.1,3,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,102,0,273.2,85,211.1,82,203.7,129,13.1,7,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,112,0,206.2,122,164.5,94,140.3,101,12.6,7,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,78,0,220.0,95,179.9,121,188.2,109,11.5,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1,0,1,0 +1,124,0,312.0,112,180.0,109,168.6,94,12.8,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0,1,0 +0,75,42,248.9,93,170.8,108,104.5,91,11.2,8,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,152,0,140.5,92,186.8,96,227.0,89,9.5,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,127,36,183.2,117,126.8,76,263.3,71,11.2,8,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,99,0,190.4,102,158.1,107,271.5,92,11.2,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +1,50,0,131.1,129,160.5,94,206.9,88,5.6,9,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1,0,1,0 +0,98,0,169.9,77,138.3,155,142.6,105,8.5,7,1,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 
+0,59,31,225.0,78,191.3,79,226.7,79,9.1,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1 +0,45,0,112.8,108,218.8,120,240.2,106,9.0,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1,0,1,0 +0,106,0,158.7,74,64.3,139,198.5,103,10.2,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,67,0,176.2,120,236.0,138,152.5,104,10.6,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,0,0,1,0,1,0 +0,3,27,67.4,116,244.0,78,281.1,93,11.4,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,96,23,183.1,88,147.4,89,350.2,108,11.3,7,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +1,55,45,130.5,114,208.4,94,141.6,114,11.0,5,4,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +1,98,36,168.0,81,163.2,125,172.7,120,8.0,2,6,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,119,0,176.8,90,224.7,81,204.6,77,7.5,15,1,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,127,0,245.2,91,217.2,92,243.1,128,13.9,6,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,114,0,169.6,85,58.9,86,179.3,124,7.4,8,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,92,25,134.0,112,206.0,111,180.6,118,9.7,4,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,36,30,146.3,128,162.5,80,129.3,109,14.5,6,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +1,122,33,174.9,103,248.2,105,164.6,116,13.5,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,88,0,183.5,93,170.5,80,193.8,88,8.3,5,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,158,0,175.9,105,188.3,88,188.3,98,11.0,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,78,0,190.3,88,194.5,89,256.5,109,11.7,5,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,44,0,62.3,92,275.0,82,138.7,108,10.8,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,0,1,1,0 +0,128,29,179.3,104,225.9,86,323.0,78,8.6,7,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,1,0,0,1 +0,134,0,7.8,86,171.4,100,186.5,80,12.9,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,108,0,201.1,101,170.7,86,237.4,113,11.6,3,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 
+0,102,0,224.7,81,129.4,112,167.6,109,15.8,6,1,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,163,22,215.1,91,138.9,102,146.2,109,12.4,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,112,0,261.4,108,154.5,102,130.9,90,11.6,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0,1,0 +0,135,0,151.7,82,119.0,105,180.0,100,10.5,6,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,0,1,0,1,0 +0,61,15,252.4,106,187.8,69,259.6,137,10.0,3,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,123,28,124.7,105,250.4,78,216.4,128,7.8,8,1,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,122,0,140.1,120,231.4,128,188.1,127,11.2,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,100,0,142.5,87,195.7,88,122.1,117,7.8,8,2,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,106,0,187.1,104,250.2,117,144.9,81,11.0,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,75,0,305.1,106,188.0,115,235.4,116,8.5,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,95,0,128.6,115,216.2,88,255.3,96,6.3,2,6,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,180,0,143.5,121,189.3,111,174.9,82,8.8,5,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,117,25,216.0,140,224.1,69,267.9,112,11.8,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +1,137,0,135.1,95,134.1,102,223.1,81,12.3,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +1,17,31,153.1,115,185.9,59,224.3,102,10.0,1,6,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +1,179,0,287.3,123,288.0,114,266.0,112,10.5,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,62,0,159.9,100,172.2,99,263.2,109,5.6,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,118,0,205.2,115,184.8,137,176.1,115,7.0,6,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,124,0,158.6,104,211.2,77,179.3,104,10.2,8,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,157,0,240.2,67,153.0,98,249.0,72,10.2,6,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,84,38,193.0,106,153.6,106,260.4,87,7.4,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 
+0,92,45,281.1,88,198.0,103,94.3,76,7.5,3,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1 +0,88,0,55.6,65,242.7,121,176.3,134,11.3,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,51,0,183.1,99,160.1,107,311.8,121,7.0,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,100,0,159.9,94,179.9,95,154.4,102,11.6,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +0,107,0,157.1,79,162.6,124,150.0,138,12.1,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,166,0,197.9,89,251.0,113,138.3,85,11.2,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,122,40,216.4,80,249.7,90,185.9,99,12.7,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1 +0,114,19,154.6,100,241.6,109,160.0,112,12.6,1,3,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,67,20,230.6,40,189.1,58,162.2,115,9.4,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,147,0,212.8,79,204.1,91,156.2,113,10.2,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,130,0,203.9,63,191.8,93,132.5,125,12.1,4,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,47,28,172.9,109,137.6,94,203.8,109,8.3,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,3,0,185.0,120,203.7,129,170.5,89,14.1,3,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,97,28,283.1,93,185.4,98,312.8,78,6.1,8,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,152,0,161.4,84,163.6,88,153.2,121,11.8,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,1,0,1,0 +0,105,0,213.4,100,204.9,52,179.7,93,9.5,6,1,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,113,30,183.8,102,183.4,123,235.0,52,11.6,7,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,61,0,197.3,67,264.5,106,210.5,116,9.0,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,62,0,159.7,86,197.5,76,121.6,105,13.9,6,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +0,63,0,211.7,107,271.7,77,203.3,108,7.4,7,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,0,1,0,1,0 +0,27,0,232.1,81,210.8,101,165.4,87,15.0,6,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 
+0,66,26,254.9,108,243.2,135,190.8,95,5.4,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,124,0,178.3,102,235.0,120,239.7,119,10.9,1,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0,1,0 +0,63,0,211.2,80,237.7,93,259.2,58,12.3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,146,0,109.0,69,265.8,98,228.3,80,12.6,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1,0,1,0 +0,122,0,232.5,96,205.5,120,213.7,91,11.9,2,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,143,0,155.5,101,213.4,89,237.9,61,7.6,11,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,71,31,115.4,90,217.4,78,239.9,102,13.1,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +1,48,22,152.0,63,258.8,131,263.2,109,15.7,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +1,94,0,234.4,103,279.3,109,234.2,121,2.0,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,143,0,202.8,109,165.8,104,143.9,71,4.6,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,1,0,1,0 +0,15,0,141.4,80,123.9,76,323.5,88,8.1,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,1,0 +0,109,0,184.0,120,120.4,119,153.7,86,11.0,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,0,1,0,1,0 +1,113,0,209.4,151,347.3,113,246.0,116,7.4,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,168,22,175.9,70,211.7,105,174.5,81,7.3,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,94,0,181.8,85,202.4,98,245.9,97,9.2,2,4,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,57,0,221.1,101,236.7,65,252.3,137,9.5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,44,0,288.1,112,258.0,92,192.4,90,10.2,4,3,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,74,27,154.1,122,195.3,150,276.7,86,13.2,2,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,151,0,175.3,106,144.3,87,160.2,88,11.8,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,196,0,133.1,80,206.5,120,221.6,96,10.3,8,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,80,36,190.3,115,256.6,78,214.9,145,3.8,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 
+0,136,0,109.4,91,207.5,111,135.0,107,11.6,5,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,95,0,183.4,98,281.3,95,105.2,113,8.2,8,1,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,100,0,96.5,86,210.2,133,146.4,106,12.5,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,155,21,195.9,91,213.9,84,88.2,111,8.6,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,171,25,223.2,77,183.2,118,150.8,90,10.2,3,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,43,0,230.2,147,186.7,121,128.4,100,9.2,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1,0,1,0 +0,131,36,214.2,115,161.7,117,264.7,102,9.5,4,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,174,0,139.4,96,143.4,108,225.2,107,10.0,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,94,0,243.2,109,147.0,88,94.9,99,7.2,4,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,1,0 +0,168,42,97.4,57,203.6,98,173.9,124,11.4,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,72,29,139.8,114,138.2,91,221.0,88,5.5,6,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,0,1,0,1 +0,62,0,120.7,70,307.2,76,203.0,99,13.1,6,4,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,64,0,261.9,113,148.1,99,145.2,74,13.8,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +1,44,0,308.6,139,150.8,94,198.7,66,7.3,3,4,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,87,0,234.8,85,140.9,91,204.3,93,9.5,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +1,74,0,298.1,112,201.3,100,214.7,88,9.7,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,142,26,220.5,94,239.5,126,254.3,109,5.9,9,2,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,201,21,192.0,97,239.1,81,116.1,125,15.1,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +1,128,32,223.5,81,188.8,74,154.9,101,9.4,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1 +1,123,0,125.5,106,128.9,96,251.9,129,6.3,6,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +1,154,0,145.9,69,208.2,141,180.9,106,14.4,10,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 
+1,104,0,234.2,128,293.1,92,183.9,79,9.8,6,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,90,0,76.1,121,290.3,73,236.9,89,10.8,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,95,0,184.2,95,181.6,101,143.4,113,12.8,4,2,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,126,0,190.9,143,149.7,72,191.4,87,13.0,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,69,0,227.0,122,258.7,111,169.7,87,8.9,1,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,112,0,266.0,97,214.6,94,306.2,100,14.2,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,85,0,259.8,85,242.3,117,168.8,72,5.4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,108,25,170.7,88,109.9,113,165.7,99,8.7,1,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +1,103,0,174.7,151,148.0,56,168.2,109,15.8,3,6,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,56,0,150.9,79,161.8,87,167.7,115,11.7,5,3,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,97,0,215.3,58,242.4,91,279.8,105,12.1,9,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,118,26,170.8,114,199.5,125,169.7,98,9.6,5,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,84,42,214.3,112,188.2,107,333.5,117,11.3,10,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,0,1,0,0,1 +1,84,0,289.1,100,233.8,97,223.5,148,12.7,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,0,1,0,1,0 +0,135,0,144.1,115,249.8,68,211.4,82,13.6,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,143,0,110.1,113,169.0,59,166.7,94,9.2,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,1,0,1,0 +0,116,0,229.3,93,184.5,111,168.2,91,8.8,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,58,0,243.1,105,231.4,108,180.9,120,7.8,4,2,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,8,36,242.9,67,170.9,59,177.3,130,4.8,12,1,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,156,0,277.0,119,238.3,106,94.4,96,8.3,3,1,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,1,0 +1,159,0,189.1,105,246.1,147,242.0,106,10.4,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 
+0,118,0,136.1,120,204.2,103,228.2,90,11.3,4,1,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,47,37,112.8,150,243.9,97,178.7,112,13.2,6,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,0,1,0,0,1 +1,83,0,337.4,120,227.4,116,153.9,114,15.8,7,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,1,0,182.1,106,134.9,106,152.3,75,10.0,3,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,119,15,160.0,95,209.5,110,82.3,107,8.7,5,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1 +0,63,0,62.9,112,202.9,111,259.0,58,8.9,8,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +0,141,0,160.1,87,256.7,120,270.0,107,7.0,1,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,145,0,199.2,124,126.0,86,289.2,135,7.6,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,0,0,1,0,1,0 +0,141,0,51.9,108,162.0,83,223.5,115,10.1,3,3,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,122,0,35.1,62,180.8,89,251.6,58,12.7,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,120,0,202.0,123,184.3,78,176.0,89,7.4,2,2,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,1,0 +0,101,0,183.9,115,255.9,101,275.0,145,10.8,11,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,39,0,93.3,83,199.6,114,206.2,104,6.5,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,128,0,179.4,94,270.4,92,191.0,88,7.9,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,68,0,222.8,99,175.8,85,202.0,111,11.0,3,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,5,0,199.2,106,187.3,12,214.0,85,13.3,3,3,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,110,34,192.3,114,129.3,114,136.3,102,6.3,12,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,66,0,208.7,84,173.3,88,264.7,107,8.3,3,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,0,1,1,0 +0,95,0,142.5,109,176.1,107,189.6,88,8.2,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,127,0,180.9,114,209.5,118,249.9,105,7.4,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,1,0 +0,130,0,139.1,72,246.0,112,207.2,121,11.4,9,5,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 
+1,132,0,197.8,66,133.9,119,177.3,94,10.9,3,4,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,152,20,239.1,105,209.1,111,268.2,130,13.3,3,5,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,86,32,70.9,163,166.7,121,244.9,105,11.1,5,3,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,53,0,184.8,98,216.4,125,141.1,116,18.4,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,150,0,146.3,133,202.7,95,234.7,103,13.1,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,90,0,61.3,91,194.4,94,143.1,80,11.4,9,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1,0,1,0 +0,46,0,214.1,72,164.4,104,177.5,113,8.2,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,67,40,104.9,65,216.3,93,217.4,128,9.6,9,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,19,34,156.6,97,224.2,97,260.9,135,11.3,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,113,0,156.0,141,256.8,72,175.3,123,11.9,5,2,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,81,0,175.5,67,249.3,85,270.2,98,10.2,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,88,0,161.5,92,173.5,108,206.2,95,7.9,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,118,0,223.4,98,220.6,101,203.9,118,6.3,6,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,1,0 +0,118,39,91.5,125,219.9,113,229.0,99,12.7,8,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,106,0,193.6,66,238.2,82,176.4,107,12.9,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,0,1,0,1,0 +1,71,0,278.9,110,190.2,67,255.2,84,11.7,7,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,65,0,136.1,112,272.9,96,220.2,104,4.4,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,71,0,290.4,108,253.9,92,263.3,126,10.1,5,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,148,26,158.7,91,160.5,127,218.3,88,9.9,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,122,0,180.0,88,145.0,77,233.7,120,11.5,6,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,58,39,211.9,40,274.4,76,210.5,139,5.4,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1,0,0,1 
+0,80,0,118.1,90,144.3,77,225.1,86,8.2,6,1,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +1,57,0,115.0,65,122.3,96,245.0,75,6.4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,1,0 +0,91,0,134.7,116,295.3,98,195.5,121,6.6,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,13,0,146.4,74,148.5,92,216.7,96,11.3,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0,1,0 +0,67,0,201.4,101,97.6,122,202.5,119,7.0,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,63,0,58.9,125,169.6,59,211.4,88,9.4,3,1,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,119,24,217.2,94,138.7,52,139.3,85,11.3,4,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,110,38,236.3,102,195.9,112,183.5,82,9.7,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,77,23,209.7,73,183.6,63,205.5,111,7.1,3,2,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,64,0,219.2,73,167.0,65,161.4,119,10.0,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,105,34,218.5,61,196.7,74,151.1,103,9.9,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,1,0,0,1 +0,129,0,98.0,99,240.7,62,254.8,123,10.5,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,59,32,211.9,120,202.9,136,213.5,95,8.8,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,86,28,221.6,74,288.4,100,240.3,105,9.0,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,110,0,227.7,88,170.0,96,128.7,57,11.7,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,107,38,204.2,57,205.9,92,286.5,80,8.3,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,106,0,191.4,124,200.7,116,230.1,76,8.2,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,127,22,166.0,114,174.5,103,244.9,68,10.2,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,155,30,128.5,86,188.4,91,254.4,85,6.8,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,75,0,147.5,110,191.7,97,135.0,68,16.4,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,143,33,239.2,109,235.5,112,156.3,95,9.5,4,1,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,109,0,200.1,72,300.9,120,236.0,68,11.9,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 
+0,67,30,186.2,117,286.7,76,164.3,113,12.9,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,67,0,129.0,78,188.0,116,235.0,102,11.2,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,124,0,150.3,101,255.9,112,136.7,62,12.5,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,81,0,203.5,89,289.6,69,212.9,71,8.7,3,3,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,73,0,157.1,109,268.8,83,181.5,91,10.0,8,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,1,0,1,0 +0,66,0,205.1,102,232.7,109,259.9,95,9.2,6,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,0,1,0,1,0 +0,36,16,149.4,111,131.8,113,132.7,87,6.7,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,149,0,196.3,108,136.8,96,154.7,87,7.7,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +1,92,0,252.3,120,207.0,112,284.6,95,12.0,5,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,0,1,1,0 +0,148,0,216.2,95,185.7,105,300.0,143,10.0,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,116,0,173.5,93,194.1,76,208.0,112,16.2,10,3,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +1,78,0,103.5,115,117.9,102,201.0,94,12.0,3,4,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,90,0,167.5,96,139.1,104,138.4,87,13.0,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,110,0,192.6,102,178.9,118,214.6,74,9.4,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,61,0,234.2,76,216.7,108,130.6,122,13.9,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,76,0,173.2,93,131.2,80,170.9,104,5.4,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,92,0,249.4,118,211.5,95,169.0,116,9.1,3,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,115,0,245.2,105,159.0,109,229.9,74,7.2,8,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,84,35,207.5,138,201.0,116,164.5,107,7.5,16,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,110,0,135.1,109,205.2,99,166.3,119,11.7,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,85,0,255.3,114,194.6,83,276.6,78,3.7,5,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 
+0,209,0,153.7,105,188.6,87,200.8,95,10.7,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,93,32,138.1,91,167.3,72,238.9,115,6.8,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,0,0,1,0,0,1 +0,43,0,179.3,97,252.7,126,227.5,114,8.0,5,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,21,0,223.2,142,216.5,114,214.7,111,12.4,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,1,0,1,0 +1,74,0,282.5,114,219.9,48,170.0,115,9.4,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,62,0,86.3,84,238.7,99,238.4,79,12.5,1,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,1,0,1,0 +0,84,0,190.2,102,197.7,141,247.5,102,9.8,6,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,84,0,138.6,102,199.0,93,204.1,137,7.8,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +1,76,0,246.8,110,206.3,63,208.4,123,13.2,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,0,1,1,0 +0,127,0,224.3,112,185.7,103,159.4,83,10.0,1,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,1,0,1,0 +0,112,0,243.4,77,182.1,97,259.2,94,12.8,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,149,43,206.7,79,174.6,122,241.5,80,10.9,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,148,38,209.2,110,116.6,73,109.6,105,16.5,4,2,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +1,105,0,106.4,71,240.1,83,147.7,114,5.3,4,6,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,103,0,62.8,124,170.4,66,280.2,78,9.4,4,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,1,0 +0,156,0,123.7,96,103.0,80,189.4,82,13.1,4,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,148,21,262.9,135,149.5,96,140.5,109,8.1,4,1,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +1,73,31,82.3,105,256.1,91,229.6,98,11.8,2,6,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,27,0,82.6,105,204.0,99,224.2,122,9.1,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,95,0,157.9,103,259.6,90,230.0,117,14.0,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,62,0,100.0,98,173.5,95,218.0,122,10.1,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,0,1,0,1,0 +0,78,0,193.1,85,172.1,105,129.6,119,10.2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,0,1,0,1,0 
+0,100,0,188.5,152,148.3,115,179.8,88,15.2,5,2,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,119,22,172.1,119,223.6,133,150.0,94,13.9,20,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1 +0,93,0,164.9,68,210.4,86,229.4,104,7.8,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,124,0,251.5,85,214.2,98,186.1,71,11.1,6,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,47,30,196.6,93,241.4,140,226.0,118,12.9,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,48,0,149.2,146,161.9,109,197.9,109,8.3,2,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,1,0,1,0 +0,92,0,197.0,84,269.3,105,158.9,105,10.8,4,1,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,116,0,129.4,84,157.3,89,215.5,77,13.3,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,0,1,0,1,0 +0,80,0,197.6,83,164.5,86,94.0,98,6.4,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,158,0,202.0,126,163.5,86,195.4,84,10.4,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,92,0,212.4,105,224.6,118,221.3,105,9.0,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,78,0,152.9,81,256.6,82,173.6,112,5.3,6,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,195,36,231.7,110,225.1,88,201.7,89,12.1,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,0,1,0,0,1 +0,170,0,184.1,106,204.9,70,224.3,133,9.8,3,2,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,99,0,197.2,127,156.0,92,204.1,99,9.9,6,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +0,126,0,197.6,126,246.5,112,285.3,104,12.5,8,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,0,1,1,0 +0,90,0,261.8,128,220.6,104,136.6,91,9.6,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,79,0,103.5,134,319.3,111,239.9,124,8.4,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,1,0 +0,38,31,197.2,118,249.9,70,298.9,104,3.9,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,131,34,156.6,134,71.0,95,261.7,120,13.4,10,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,0,1,0,0,1 +0,46,0,156.4,105,185.5,98,226.7,96,11.8,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 
+0,130,0,150.4,119,230.5,99,186.3,76,12.3,4,1,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,98,0,187.2,127,195.6,88,181.8,129,5.1,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,80,0,220.0,114,207.7,76,168.4,137,12.1,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,147,0,205.3,95,166.7,128,240.6,84,7.8,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,145,0,229.6,82,138.1,103,250.8,109,3.3,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1,0,1,0 +0,109,0,175.4,125,250.7,87,289.3,74,9.8,9,1,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +1,123,0,123.2,104,190.0,117,170.3,95,12.9,5,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,46,0,210.2,92,227.3,77,200.1,116,13.1,7,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,143,24,186.6,69,222.0,116,234.9,138,11.6,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,120,23,221.9,114,254.7,84,250.5,117,7.2,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,190,26,116.7,71,145.9,88,175.1,103,9.9,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,217,0,176.4,115,158.8,128,306.6,107,9.3,3,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,112,29,198.8,122,238.6,114,289.5,69,11.5,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,0,0,1,0,0,1 +0,90,0,203.4,146,226.7,117,152.4,105,7.3,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,134,40,142.9,105,88.6,61,290.0,96,10.8,6,1,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,109,0,162.6,138,154.0,109,209.7,118,11.5,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,144,0,177.5,93,287.4,75,180.5,118,11.9,3,2,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,41,0,207.2,138,214.1,83,193.0,105,11.9,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,150,0,169.2,123,216.8,83,179.4,107,12.6,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,1,0,1,0 +0,107,0,123.1,100,158.4,82,256.1,82,9.3,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0,1,0 +0,123,0,175.7,78,184.6,96,156.9,92,9.1,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0,1,0 
+0,160,0,216.8,77,207.3,117,228.6,117,5.6,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,103,0,206.5,125,180.2,113,220.6,95,12.2,4,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,131,0,187.9,110,200.5,101,202.6,125,10.2,11,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +1,88,0,301.5,136,257.7,72,132.9,118,13.4,2,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,106,0,178.4,143,247.0,123,259.9,105,9.6,2,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,79,0,177.9,83,167.3,84,223.7,142,15.2,8,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,90,0,179.2,77,210.7,99,276.9,58,9.2,6,2,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,27,0,72.7,75,208.6,117,65.8,71,9.9,3,1,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,57,33,193.4,105,231.6,79,226.2,90,11.1,11,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,7,30,221.4,114,165.8,116,247.0,105,10.8,12,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,90,39,94.8,89,219.1,91,197.4,65,11.4,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1,0,0,1 +1,106,0,210.6,96,249.2,85,191.4,88,12.4,1,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,110,0,159.5,145,202.3,101,256.0,96,16.7,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,121,35,193.8,62,197.6,97,218.8,95,5.9,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +1,125,0,187.3,118,160.7,111,263.8,112,9.6,2,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +0,61,25,163.7,78,113.2,112,134.1,118,9.9,3,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,92,0,255.8,125,142.7,111,181.2,101,11.7,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +0,93,0,190.7,114,218.2,111,129.6,121,8.1,3,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,1,0,1,0 +1,107,0,294.9,71,192.8,78,148.1,87,13.2,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,204,0,205.2,145,154.8,95,191.4,77,14.1,5,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,83,0,132.4,120,121.6,101,197.7,84,8.6,2,1,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 
+0,19,0,201.5,123,129.2,110,220.6,98,12.9,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,128,25,265.1,110,197.4,99,244.7,91,10.0,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,65,31,282.3,70,152.0,89,225.5,93,12.0,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,146,0,111.1,126,313.4,95,215.7,82,10.5,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,0,1,1,0 +0,32,31,232.8,97,183.5,111,206.8,111,13.0,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,83,0,271.5,87,216.3,126,121.1,105,11.7,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,0,0,1,1,0 +0,66,36,106.7,76,209.8,77,190.4,117,12.1,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,121,0,207.9,98,210.5,96,109.6,114,7.7,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,0,1,0,1,0 +0,119,26,132.0,100,173.3,121,203.5,108,11.6,5,1,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,185,19,157.3,123,257.7,94,190.4,107,9.6,6,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,97,0,239.8,125,214.8,111,143.3,81,8.7,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,107,0,146.9,94,114.3,111,114.5,97,11.4,5,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,102,0,144.4,87,266.5,128,217.6,59,7.1,7,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,158,0,205.2,97,240.6,77,79.7,108,14.4,12,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,113,0,72.5,88,204.0,112,117.9,118,6.6,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,1,0,1,0 +0,97,43,121.1,105,260.2,115,222.4,100,8.3,5,3,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,59,0,182.5,104,204.7,95,229.9,100,11.3,8,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,126,0,103.7,93,127.0,107,329.3,66,14.4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,127,0,218.6,93,149.9,130,204.6,131,9.2,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,1,0,1,0 +0,104,0,280.4,127,179.4,79,150.6,77,15.2,6,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,139,0,161.5,121,192.9,137,168.3,96,11.2,13,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,0,1,1,0 
+0,117,0,102.8,119,206.7,91,299.0,105,10.1,7,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,35,0,124.2,102,123.9,115,135.7,100,13.1,8,2,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,73,0,137.1,102,210.8,114,191.4,120,11.1,4,1,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,65,0,213.4,111,234.5,94,250.1,123,2.7,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,90,0,145.6,103,197.1,137,294.5,83,10.5,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,93,21,134.2,105,162.5,128,186.6,90,11.8,2,4,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +1,115,0,122.0,110,220.2,100,179.7,124,10.8,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,1,0 +0,103,31,185.4,105,197.6,126,147.1,110,14.5,4,2,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,87,28,143.5,106,223.5,147,175.4,69,11.2,5,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,138,0,170.5,87,118.2,116,187.9,111,11.2,7,2,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,46,0,139.4,81,223.7,113,173.1,77,13.6,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,149,0,207.3,115,198.4,82,114.1,83,8.6,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,84,0,86.0,83,260.7,86,98.6,109,8.9,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1,0,1,0 +0,97,24,133.2,135,217.2,58,70.6,79,11.0,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0,0,1 +0,151,0,235.9,104,80.6,91,212.8,116,5.8,2,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0,1,0 +0,147,0,183.8,113,164.7,110,111.0,87,10.1,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,1,0 +1,133,0,254.7,103,252.2,80,178.1,103,8.0,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,1,0 +0,64,27,201.3,101,143.8,89,150.2,127,12.3,3,1,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,127,0,221.0,100,160.7,113,233.1,96,6.8,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,112,27,213.0,121,226.2,101,189.8,99,11.1,3,4,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,38,36,115.4,98,166.2,83,184.7,79,15.2,6,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 
+1,61,0,267.1,104,180.4,131,230.6,106,17.3,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,56,0,164.3,92,233.7,107,187.3,104,11.8,1,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +1,49,0,266.3,90,207.8,117,205.0,98,14.0,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,125,0,169.3,90,156.0,138,210.8,106,11.6,6,2,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +1,92,29,201.3,130,203.7,115,129.9,113,6.4,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,98,0,162.8,65,185.0,109,219.5,104,6.0,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,112,0,214.8,112,209.7,104,164.4,97,9.4,5,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,176,23,283.2,130,162.6,74,177.7,104,7.2,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,95,0,58.2,96,202.1,126,210.5,97,10.4,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,115,0,210.6,120,153.1,84,262.2,79,11.0,5,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,110,0,151.8,106,138.0,126,233.5,112,11.2,8,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,68,29,239.5,82,203.8,105,167.8,70,9.9,6,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,97,15,117.6,97,196.3,126,157.4,113,6.4,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +1,182,0,279.1,124,180.5,108,217.5,104,9.5,11,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,87,39,82.6,113,224.4,63,163.6,88,9.5,1,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,120,0,131.7,99,163.1,109,201.1,116,10.7,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,54,0,272.6,83,248.7,74,197.4,111,9.5,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,57,37,201.2,76,280.1,122,154.2,110,11.8,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,93,32,218.7,117,115.0,61,192.7,85,9.4,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,1,0,0,1 +0,34,0,124.8,82,282.2,98,311.5,78,10.0,4,2,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,58,0,165.4,100,115.7,87,193.8,118,12.8,5,2,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 
+0,92,0,201.9,74,226.8,119,217.5,80,13.7,6,3,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,55,28,105.3,82,197.4,109,187.5,91,8.6,6,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,57,0,154.2,78,196.7,85,253.5,97,10.1,9,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +1,122,0,231.2,141,267.8,136,240.3,100,8.8,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,106,29,83.6,131,203.9,131,229.5,73,8.1,3,1,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +1,75,0,314.6,102,169.8,86,285.1,100,5.7,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,102,0,102.6,89,246.0,77,170.5,140,9.1,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,100,29,213.6,127,175.9,82,207.2,100,8.9,3,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,122,0,146.3,117,218.7,93,236.0,97,11.5,5,1,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +0,57,17,236.5,94,163.1,94,236.7,117,12.2,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1,0,1 +0,109,46,217.5,123,233.7,84,163.9,99,9.0,3,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,114,34,154.4,109,221.4,142,208.5,103,10.3,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,70,30,143.4,72,170.0,92,127.9,68,9.4,4,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,1,0,0,1 +0,193,0,170.9,124,132.3,95,112.9,89,11.6,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,62,0,172.4,132,230.5,100,228.2,109,11.0,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0,1,0 +0,53,0,57.5,95,265.5,131,244.3,128,11.6,6,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,112,31,225.2,89,256.8,117,249.7,87,11.5,1,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,93,0,146.3,85,216.6,95,233.0,82,11.5,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,95,0,197.0,88,190.4,68,211.9,104,16.1,8,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,1,0,1,0 +1,94,0,89.5,94,339.9,106,172.9,76,7.9,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,1,0 +0,164,0,123.3,78,170.0,85,165.9,78,12.7,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,1,0,1,0 +0,159,0,87.7,103,278.2,97,170.6,93,10.5,10,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,1,0,1,0 
+0,58,43,142.8,96,272.3,100,193.4,105,8.9,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1 +0,67,36,115.6,111,237.7,94,169.9,103,9.9,12,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,133,0,143.8,71,184.0,131,275.5,132,12.9,4,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,122,33,270.8,96,220.4,110,169.9,104,11.8,8,4,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +1,93,0,131.4,78,219.7,106,155.7,103,11.1,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,1,0 +1,81,0,261.4,141,215.7,102,271.8,96,8.0,6,1,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,101,0,154.4,130,217.2,101,185.4,52,13.9,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,6,0,183.6,117,256.7,72,178.6,79,10.2,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +1,147,0,274.0,92,231.8,82,283.6,83,6.2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +1,100,0,107.0,63,105.7,67,243.1,74,12.8,3,4,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +1,154,0,154.5,122,214.2,71,178.0,105,12.0,2,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,13,25,176.6,65,172.7,96,104.5,128,11.3,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,1,0,0,1 +1,112,0,174.3,123,140.2,124,215.4,89,9.0,6,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,93,22,306.2,123,189.7,83,240.3,107,11.7,2,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,67,0,215.5,102,190.7,95,214.5,106,8.6,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,148,0,233.5,81,187.7,71,122.3,97,9.6,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +0,56,0,177.7,114,215.6,110,236.7,67,10.5,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +0,119,0,154.5,129,193.6,87,180.9,145,13.4,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,67,0,210.7,116,219.2,86,179.7,83,7.2,6,1,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,69,0,195.1,91,261.5,57,203.8,90,11.4,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,105,27,141.2,96,167.7,94,274.4,101,11.4,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 
+0,172,47,274.9,102,186.6,118,245.0,123,8.8,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,68,30,122.9,93,233.5,91,199.5,144,9.6,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,177,0,227.8,81,161.8,97,217.0,106,8.0,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1,0,1,0 +0,148,25,230.7,102,233.8,109,215.8,90,13.5,2,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,108,0,210.6,117,164.2,103,201.4,68,9.4,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,161,0,156.1,114,180.3,63,179.6,115,11.1,9,2,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +0,63,36,199.0,110,291.3,111,197.6,92,11.0,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1 +0,163,23,224.0,126,233.5,89,293.9,104,8.8,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,139,25,138.3,96,80.6,79,163.7,83,8.3,8,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,93,20,187.5,110,169.8,94,175.3,127,12.1,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1 +0,83,0,208.9,71,214.8,92,247.9,108,13.0,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,28,0,236.8,102,167.1,87,280.2,115,9.7,3,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,110,0,162.0,81,247.5,89,155.5,99,8.9,8,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +1,161,0,332.9,67,317.8,97,160.6,128,5.4,9,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,80,0,160.6,103,237.0,109,245.1,88,10.7,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,0,1,0,1,0 +0,79,0,205.7,123,214.5,108,226.1,106,6.7,18,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,99,0,155.3,93,265.7,95,145.7,67,12.4,4,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,127,25,173.0,91,245.8,64,300.0,99,4.8,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,0,0,1,0,0,1 +0,138,0,87.6,112,266.9,107,214.6,104,9.8,10,2,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,165,0,216.6,126,190.8,104,224.7,123,12.4,8,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,56,0,222.7,133,277.0,89,101.8,94,13.6,4,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 
+0,85,30,173.1,107,247.2,101,158.7,104,11.5,5,1,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,105,0,162.3,99,212.5,95,214.7,114,11.1,8,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,116,0,197.9,84,168.1,113,239.8,145,12.0,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +1,52,0,155.0,110,133.4,104,176.1,84,7.0,4,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,64,0,225.3,134,108.2,87,139.6,132,17.3,9,1,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +1,120,0,179.9,72,170.0,98,190.6,89,13.8,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,0,0,1,1,0 +0,56,0,146.1,57,196.2,97,310.1,110,9.2,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,147,24,219.9,118,208.5,116,352.5,111,8.1,4,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1 +0,172,0,215.7,140,146.3,84,264.6,83,7.1,1,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,30,30,217.4,74,213.8,86,227.2,104,6.6,3,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,18,0,273.6,93,114.6,116,250.6,120,8.2,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,101,42,209.2,82,159.7,74,181.6,100,9.5,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,184,0,151.7,93,178.5,77,229.1,111,13.1,8,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,0,1,0,1,0 +0,107,0,167.3,100,163.9,79,185.9,100,6.7,5,2,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,105,0,146.4,81,225.1,80,230.1,117,8.5,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,125,0,206.0,128,198.1,71,135.9,116,13.2,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0,1,0 +0,91,0,231.8,120,150.6,106,269.2,129,11.6,7,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +1,81,28,167.9,147,190.7,105,193.0,103,9.2,6,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,54,33,112.0,90,208.0,112,150.3,83,11.3,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,1,0,0,1 +0,44,0,202.6,89,163.0,96,268.1,151,8.3,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,93,0,179.3,93,178.6,98,225.2,131,11.5,6,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 
+0,42,0,146.3,84,255.9,113,45.0,117,8.0,12,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1,0,1,0 +0,73,0,122.0,92,138.3,114,224.2,128,5.8,5,1,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,69,0,195.3,70,216.7,108,259.9,119,12.5,4,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +1,64,0,148.1,73,164.9,101,216.0,125,12.3,2,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +1,121,30,198.4,129,75.3,77,181.2,77,5.8,3,3,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,142,24,239.8,103,285.9,65,256.7,106,9.5,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1,0,0,1 +0,63,13,214.2,61,181.2,88,174.0,68,10.3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,1,0,0,1 +0,117,13,207.6,65,152.7,77,232.8,95,9.7,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +1,64,0,236.2,77,218.6,85,194.1,97,13.2,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,1,0 +1,92,0,154.0,122,329.8,88,288.0,117,5.6,2,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,75,38,163.6,132,146.7,113,345.8,115,13.1,3,3,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,113,0,215.9,93,240.1,85,156.7,123,4.9,5,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,132,36,226.2,103,181.6,125,258.8,102,10.5,5,3,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,114,36,309.9,90,200.3,89,183.5,105,14.2,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,81,0,153.5,99,197.6,102,198.5,86,6.3,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,52,38,169.3,88,225.9,97,172.0,86,8.2,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,99,0,128.8,86,203.9,105,282.6,131,14.1,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,103,0,171.7,78,144.5,86,157.9,106,6.8,3,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,1,0 +0,20,35,171.5,98,153.1,127,165.6,125,7.1,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,63,0,180.5,126,230.0,98,232.5,73,10.6,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,94,0,245.0,112,180.4,91,262.9,105,9.7,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,74,0,172.1,105,211.7,99,182.2,105,11.6,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 
+1,58,0,247.2,116,303.7,103,105.4,94,9.3,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,95,0,190.2,119,157.1,70,181.5,120,14.0,6,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,94,0,252.6,104,169.0,125,170.9,106,11.1,7,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,74,0,102.7,89,149.3,100,188.1,114,11.0,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,115,0,132.0,90,197.5,75,175.8,114,0.0,0,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,1,0,1,0 +0,50,24,214.3,129,289.8,55,312.5,130,10.6,4,1,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,64,37,154.6,92,83.4,103,165.9,99,13.3,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,123,23,245.0,88,265.0,105,239.7,108,14.9,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,169,0,100.8,112,230.0,69,193.6,95,9.5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,88,25,288.5,114,203.4,74,228.4,117,13.0,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1 +0,125,0,168.6,99,175.6,107,243.3,92,10.9,7,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,50,0,157.1,90,223.3,72,181.4,111,6.1,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,80,0,51.5,90,164.0,98,169.4,80,9.5,4,3,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,106,0,165.3,118,210.0,101,187.2,93,8.5,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,113,0,122.2,112,131.7,94,169.5,106,10.3,9,5,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +0,113,0,150.1,120,200.1,85,266.7,105,11.0,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,124,0,157.5,70,130.7,79,193.4,98,9.6,4,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,103,0,70.9,134,134.5,112,168.8,164,12.0,6,2,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,112,0,166.0,79,74.6,100,247.9,74,6.3,7,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,82,0,200.3,96,201.2,102,206.1,60,7.1,1,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,100,30,58.8,104,219.5,107,152.3,118,7.1,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 
+0,99,0,140.7,88,210.9,98,229.9,125,12.4,4,2,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,155,0,163.1,94,291.7,108,96.4,111,11.2,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,1,0 +0,100,38,224.7,121,294.0,131,290.0,61,9.8,6,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,1,0,0,1 +0,66,0,201.3,95,152.8,66,233.2,101,7.5,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,121,0,150.7,105,197.3,133,169.0,116,9.2,15,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +1,161,0,191.9,113,70.9,87,204.8,107,13.4,4,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0,1,0 +0,94,0,220.8,111,156.2,67,187.9,89,10.5,4,2,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,56,0,226.0,112,248.5,118,140.5,142,6.9,11,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,111,28,128.8,104,157.3,52,147.4,76,10.3,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,116,0,137.4,126,120.0,94,130.3,64,12.4,2,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,92,0,196.5,82,190.0,89,163.2,99,10.8,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,130,0,213.1,105,206.2,108,163.4,93,8.9,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,132,39,175.7,93,187.2,94,225.5,118,8.6,3,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,82,0,101.0,93,155.6,104,304.4,93,13.3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,1,30,183.1,95,232.6,110,248.3,110,8.4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,85,0,144.4,88,264.6,105,185.4,94,9.9,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,144,0,203.5,100,247.6,103,194.3,94,11.9,11,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,1,0 +1,74,0,174.1,96,251.1,94,257.6,123,8.3,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +1,139,0,134.4,106,211.3,98,193.6,125,10.2,2,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +1,103,36,87.2,92,169.3,110,166.7,80,10.9,5,6,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,53,0,205.1,86,160.5,95,149.5,142,10.7,2,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 
+0,104,0,156.2,93,193.0,54,222.7,94,13.1,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,96,31,183.4,126,195.5,106,180.1,93,10.5,5,1,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,118,0,187.4,97,177.8,89,233.4,97,12.2,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,101,28,220.3,96,285.8,72,203.0,111,9.4,6,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0,0,1 +0,96,0,160.2,117,267.5,67,228.5,68,9.3,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,42,0,150.7,52,246.7,96,103.8,118,7.0,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,110,0,55.3,102,164.7,124,200.7,108,10.2,5,1,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,68,0,172.7,95,139.1,90,174.3,99,11.7,1,2,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,112,0,170.4,103,200.2,71,258.3,100,11.6,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,115,24,142.1,124,183.4,129,164.8,114,9.6,4,1,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,157,29,219.2,102,206.0,109,192.4,117,15.0,5,1,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,69,27,268.8,78,246.6,89,271.9,102,16.4,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,45,29,135.8,104,222.5,101,235.6,92,7.9,6,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,117,0,239.9,84,174.8,106,209.5,93,9.8,2,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,155,0,190.3,123,301.3,96,214.6,134,8.0,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,160,0,176.2,90,196.0,115,263.9,95,9.2,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,122,37,163.0,107,312.8,118,200.0,85,11.6,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,129,0,101.4,145,249.1,116,157.6,107,7.1,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,145,0,175.8,89,274.3,119,226.6,69,12.4,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,23,0,190.2,89,166.4,108,219.8,73,15.0,4,6,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,116,35,182.8,122,212.7,119,193.8,103,11.0,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 
+0,66,0,146.4,107,196.5,99,230.1,106,7.8,2,1,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,151,0,170.2,89,187.5,83,119.5,100,4.3,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0,1,0 +0,99,0,128.3,78,215.3,120,143.7,140,14.3,9,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,1,0,1,0 +0,70,0,59.5,103,257.2,106,208.3,86,11.1,6,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,24,25,164.9,110,209.3,105,231.2,55,6.7,9,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,102,0,233.8,103,221.6,131,146.9,106,12.8,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,1,0 +0,73,0,187.3,118,239.7,90,167.5,108,15.1,2,1,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,35,0,149.3,113,242.2,122,174.3,104,8.9,6,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +1,119,16,147.2,103,160.1,96,184.0,120,7.7,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,1 +0,78,0,139.2,140,191.4,113,286.5,125,11.8,3,3,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,64,43,118.4,100,144.1,108,158.1,91,8.5,6,1,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,105,0,166.1,93,175.9,106,243.5,55,16.2,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,60,0,135.4,134,205.9,85,204.0,103,7.9,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,104,23,280.2,136,220.5,92,136.9,102,13.3,3,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,1,0,0,1 +0,40,0,115.7,105,127.8,113,107.5,91,9.3,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,132,0,81.1,86,245.2,72,237.0,115,10.3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,1,0,1,0 +0,9,39,214.1,108,169.2,115,189.7,117,10.1,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,129,34,204.5,79,132.8,113,190.1,117,14.8,9,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +1,150,0,136.6,112,209.4,81,161.1,78,12.2,2,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,181,27,190.3,93,249.0,127,215.7,82,10.6,4,1,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,125,0,233.3,65,209.8,93,210.6,109,9.1,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 
+0,99,0,54.8,92,173.0,103,195.1,125,7.5,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +1,112,0,272.5,119,226.1,94,159.1,94,16.4,5,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,88,0,192.0,91,127.6,127,155.6,125,7.5,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,132,0,99.5,110,129.1,80,125.1,124,9.7,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,122,22,204.5,92,139.6,121,205.0,103,8.6,5,2,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,84,0,146.8,133,171.7,73,234.5,69,9.9,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1,0,1,0 +0,139,0,271.6,130,156.0,131,136.3,108,11.6,9,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0,1,0 +0,201,0,212.7,72,225.2,90,195.1,99,7.0,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,73,0,234.7,102,195.7,110,253.4,71,8.4,8,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,71,0,243.7,124,60.0,90,189.0,129,11.3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0,1,0 +0,166,0,199.6,93,214.3,99,196.8,110,7.2,5,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,165,0,209.4,67,273.8,89,150.2,88,12.8,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,99,0,238.4,96,246.5,130,198.4,117,12.4,4,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,49,0,119.3,117,215.1,109,178.7,90,11.1,1,1,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,114,0,84.7,118,249.9,86,193.4,95,14.5,8,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,66,35,190.8,100,261.3,93,209.5,108,8.9,6,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,114,30,206.2,79,260.0,91,291.6,83,11.4,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,1,0,0,1 +0,65,0,192.0,89,139.5,88,187.4,102,5.5,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,120,0,203.3,108,259.9,66,115.9,103,7.8,2,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,80,0,203.7,92,216.4,97,154.2,66,7.6,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,55,20,207.7,91,199.7,113,216.5,110,7.3,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,49,0,213.8,79,265.1,93,239.8,128,15.6,7,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 
+0,33,0,164.0,99,153.1,102,123.8,104,6.4,4,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +0,86,0,148.2,71,285.1,91,166.4,155,6.2,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,109,0,209.1,141,205.0,93,119.4,111,7.8,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,1,0 +0,107,22,281.1,83,143.7,130,239.4,128,11.2,9,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +1,80,0,322.3,113,222.0,95,162.8,123,6.7,8,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,152,0,206.3,98,292.8,82,43.7,121,10.6,4,1,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,108,34,162.1,83,171.8,117,259.8,76,9.6,3,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +1,57,0,272.7,74,224.9,85,178.2,104,10.5,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,52,0,129.3,80,142.7,101,258.3,89,12.3,4,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +1,38,0,175.7,109,211.8,97,137.9,109,9.2,3,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,123,0,140.0,106,153.7,101,50.1,87,12.5,1,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,108,0,103.0,129,242.3,103,170.2,89,7.9,3,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,31,0,165.4,84,203.7,107,201.7,65,8.2,1,1,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,11,0,143.4,130,289.4,50,194.0,100,9.7,6,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,84,0,181.5,129,130.7,112,186.5,118,8.5,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,100,0,179.1,123,196.6,132,186.7,116,10.2,10,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,110,0,188.0,127,90.5,118,150.3,64,15.3,3,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,127,27,2.6,113,254.0,102,242.7,156,9.2,5,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,126,0,175.4,120,98.3,71,201.9,93,10.6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +1,74,0,125.8,103,207.7,96,207.4,143,14.1,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,1,0 +0,51,0,180.5,88,134.7,102,170.7,97,10.0,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,1,0,1,0 
+0,91,0,217.9,71,230.1,116,232.1,110,10.6,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,1,0,1,0 +0,41,0,202.9,97,153.8,104,113.5,92,9.0,3,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1,0,1,0 +1,108,0,291.6,99,221.1,93,229.2,110,14.0,9,2,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,96,0,97.6,98,105.5,118,220.2,105,11.6,9,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,125,23,120.5,104,227.8,115,158.5,100,10.2,3,1,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,63,0,142.5,92,208.3,102,228.9,120,7.5,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,0,1,0,1,0 +0,108,30,276.6,99,220.1,113,177.9,95,9.8,6,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,135,0,194.8,97,235.3,118,174.4,126,11.0,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,113,0,81.3,116,220.6,124,235.7,113,8.9,3,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,51,0,227.2,89,194.4,106,243.4,126,14.9,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,130,0,120.5,127,189.7,52,270.1,107,14.3,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,51,0,232.4,109,187.4,95,231.2,107,9.1,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,0,1,0,1,0 +0,133,0,127.3,108,251.3,81,135.0,88,10.3,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,1,0 +0,161,0,72.8,120,267.1,120,222.5,91,11.8,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,98,0,221.2,80,213.6,104,291.8,89,11.9,3,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,63,0,153.5,81,287.3,115,230.2,85,6.5,5,2,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,45,38,196.8,92,254.2,108,261.8,85,7.7,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,152,0,197.1,126,130.1,76,78.1,100,7.4,4,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,0,1,0,1,0 +0,84,0,130.2,105,278.0,60,305.4,74,14.0,6,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,1,0,1,0 +0,86,0,124.1,82,202.6,120,289.6,119,6.7,8,3,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,154,0,191.4,93,205.4,119,205.7,121,10.2,3,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 
+0,132,0,176.7,132,244.1,80,176.3,120,9.1,4,2,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,52,0,215.9,67,217.0,108,342.8,130,5.2,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,161,0,196.6,73,170.2,79,194.3,79,12.5,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +1,54,0,247.5,85,225.4,93,244.3,132,10.2,2,2,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,107,0,134.0,104,174.5,94,311.1,79,7.3,3,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1,0,1,0 +1,76,0,299.5,125,226.7,92,210.7,134,13.7,4,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,133,0,227.4,90,73.2,135,114.3,99,4.7,7,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,139,0,196.0,135,186.0,146,153.0,92,9.8,1,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,88,0,65.4,97,168.2,76,236.0,113,13.8,1,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,93,0,98.4,78,249.6,129,248.2,114,14.2,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,68,0,143.6,80,134.3,65,215.6,84,15.5,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,110,0,208.0,69,95.1,94,178.5,129,8.0,11,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,112,36,113.7,117,157.5,82,177.6,118,10.0,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,70,0,148.4,110,267.1,90,151.5,101,8.9,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,125,34,268.4,112,222.2,108,117.6,102,10.3,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,131,0,166.5,129,210.2,107,257.2,93,9.9,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,120,0,165.0,100,317.2,83,119.2,86,8.3,8,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,76,26,214.6,110,205.2,87,134.6,140,8.1,2,1,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,64,27,182.1,91,169.7,98,164.7,86,10.6,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,101,0,217.7,118,231.7,128,185.3,128,0.0,0,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,118,0,188.8,60,217.4,64,220.1,100,8.2,7,4,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,119,32,173.0,101,209.4,93,231.1,91,12.2,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1 
+0,124,0,178.4,72,233.6,134,179.4,91,12.0,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,0,1,0,1,0 +0,37,0,221.0,126,204.5,110,118.0,98,6.8,3,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,108,0,115.1,114,211.3,70,136.1,85,13.8,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,1,0 +0,148,0,17.6,121,161.7,125,203.1,82,10.6,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,90,0,214.9,97,117.8,117,133.7,78,11.8,2,2,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,141,0,217.1,110,241.5,111,253.5,103,12.0,6,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,1,0,1,0 +0,179,0,170.7,54,191.1,108,214.6,107,13.3,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,32,0,171.2,82,185.6,102,203.3,64,10.2,7,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1,0,1,0 +1,60,29,265.9,113,215.8,94,108.1,82,14.0,12,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,1 +0,130,0,132.4,81,200.3,110,202.5,103,6.0,1,2,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,100,0,115.9,87,111.3,56,170.2,77,7.1,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,70,0,129.9,102,208.7,133,231.4,93,14.3,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,91,27,204.6,96,136.0,93,210.5,82,6.6,2,3,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,192,0,185.0,88,224.9,98,212.4,105,11.4,3,2,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,59,0,153.6,92,205.5,88,114.5,89,12.5,10,1,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,134,50,208.8,130,132.9,104,136.7,107,11.1,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,75,0,203.3,70,228.9,97,222.2,118,14.3,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +1,16,0,229.6,78,205.7,108,166.2,91,10.8,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,1,0,1,0 +0,61,29,128.2,119,171.7,83,250.9,114,11.7,6,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,0,0,1,0,1 +0,165,0,150.5,75,193.1,93,311.6,93,10.3,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,119,0,94.2,108,264.1,100,203.7,79,7.3,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 
+0,93,0,239.8,70,251.8,99,168.6,112,10.9,10,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,64,0,162.6,83,152.3,109,57.5,122,14.2,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,103,0,185.0,117,223.3,94,222.8,91,12.6,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,102,0,174.5,73,213.7,114,164.7,116,10.3,5,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,119,32,142.6,77,208.2,126,171.0,102,12.0,2,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,52,20,133.3,63,184.1,123,272.9,107,13.5,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,89,0,89.7,80,179.8,81,145.7,120,9.5,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,89,0,326.3,112,165.1,110,162.9,97,7.5,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,1,0,1,0 +0,149,20,198.9,77,274.0,88,190.7,76,14.3,9,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +1,167,0,169.2,124,173.3,108,216.5,64,12.4,4,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,0,1,0,1,0 +0,83,0,231.3,100,210.4,84,217.4,106,12.4,2,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,129,0,267.4,78,204.2,85,111.7,146,5.9,4,1,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +0,107,25,248.6,91,119.3,115,194.3,83,12.0,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,1,0,0,1 +0,133,0,221.1,133,160.2,140,161.8,84,8.4,3,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,99,0,169.2,70,271.5,77,170.2,104,10.6,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,144,0,133.3,101,255.5,127,228.6,68,11.6,2,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,98,0,12.5,67,256.6,90,169.4,88,7.7,9,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +0,107,14,114.3,132,199.8,91,194.7,74,7.5,8,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,105,0,147.7,103,222.7,78,163.5,102,12.8,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1,0,1,0 +0,161,0,173.4,100,213.7,74,141.5,69,11.5,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,108,0,240.2,78,230.3,109,217.0,83,5.2,1,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 
+0,105,0,232.6,96,253.4,117,154.0,101,10.5,9,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,46,0,199.2,111,175.1,83,210.6,84,10.2,2,3,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,134,32,216.8,78,102.2,111,174.0,83,8.6,2,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,117,0,143.3,103,211.3,108,185.2,96,11.5,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,163,40,231.9,56,211.8,91,268.5,74,12.3,3,2,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,77,0,233.8,104,266.5,94,212.7,104,7.6,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,109,0,175.6,80,238.0,94,198.4,103,10.2,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,94,0,271.2,105,202.6,105,221.6,51,11.5,3,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0,1,0 +0,87,0,185.8,119,192.3,83,200.0,96,6.6,4,1,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,105,33,209.6,68,146.9,140,121.0,131,10.6,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,125,0,137.1,94,209.8,83,238.4,114,8.6,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +1,112,0,111.9,92,114.0,143,146.8,79,14.1,3,5,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,119,0,124.3,68,207.1,88,157.4,93,14.8,1,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +1,110,0,271.1,108,237.0,122,239.9,122,9.8,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,92,0,62.6,111,180.6,126,221.7,80,10.4,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,1,0 +0,144,35,174.8,127,219.6,93,255.8,90,12.8,3,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1 +0,73,0,194.8,112,167.2,85,100.3,61,10.8,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,54,39,117.6,82,159.2,60,236.4,113,11.3,10,2,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,77,29,211.1,89,223.5,97,148.4,106,9.7,9,2,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,120,28,215.8,123,285.2,76,192.1,78,6.9,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,0,1,0,0,1 +0,115,0,127.7,67,182.9,90,172.9,92,10.6,7,1,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 
+1,69,0,279.8,90,248.7,91,171.0,118,8.4,10,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,0,1,1,0 +0,73,31,194.4,104,176.0,84,230.1,110,11.5,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,4,0,145.3,89,303.8,93,206.1,82,8.9,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,1,1,0 +1,90,0,148.2,96,220.4,111,134.2,97,9.2,1,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,0,0,1,1,0 +0,91,0,203.1,106,210.1,113,195.6,129,12.0,3,3,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,120,0,184.5,103,209.0,86,169.7,70,10.2,6,2,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,181,0,229.9,130,144.4,93,262.4,110,14.2,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,92,28,151.1,90,194.8,79,239.2,114,10.0,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,122,34,146.4,104,89.7,103,220.0,91,15.6,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,76,0,169.5,77,124.0,87,219.4,92,10.0,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,139,31,203.5,82,200.3,72,214.0,112,13.4,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,1,0,1 +0,119,0,138.3,89,170.5,78,263.9,98,13.5,6,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +1,120,0,212.1,131,209.4,104,167.2,96,5.3,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,46,0,164.2,116,196.2,153,236.1,119,8.1,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,29,0,196.8,81,168.0,110,132.6,98,12.7,7,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,116,0,215.4,104,204.8,79,278.5,109,12.6,5,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +0,122,29,195.4,83,268.2,93,168.0,95,8.4,6,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,112,0,161.9,138,200.9,114,134.0,134,10.7,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1,1,0 +0,96,0,144.0,102,224.7,73,227.7,91,10.0,7,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1,0,1,0 +0,75,28,200.6,96,164.1,111,169.6,153,2.5,5,1,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,107,37,60.0,102,102.2,80,261.8,106,11.1,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 
+0,142,30,154.0,75,165.8,97,270.0,83,10.8,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,157,0,220.7,105,119.3,127,165.1,113,11.5,7,4,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,98,0,161.0,117,190.9,113,227.7,113,12.1,4,4,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,89,0,197.1,110,165.9,115,227.3,106,12.8,3,1,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,79,0,130.2,119,290.9,121,194.8,140,14.0,6,3,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,86,30,99.9,84,263.5,125,254.7,90,9.8,7,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,1,0,0,1 +0,95,0,160.0,133,215.3,98,188.9,87,9.1,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,149,28,180.7,92,187.8,64,265.5,53,12.6,3,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,72,21,186.7,108,335.0,86,187.2,119,16.5,4,1,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,16,0,161.9,100,230.1,138,148.8,78,10.2,11,3,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,164,0,192.1,95,249.8,94,132.6,100,7.3,3,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,93,0,141.7,95,221.0,100,227.1,71,10.2,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0,1,0 +0,155,0,184.6,102,196.0,117,226.5,122,7.8,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,83,0,195.0,92,210.5,83,180.6,92,11.0,13,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,59,0,204.3,65,247.3,123,214.7,94,12.0,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,107,0,189.7,76,156.1,65,244.0,91,8.3,3,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,1,0,1,0 +0,148,0,203.0,92,150.9,125,245.5,131,14.6,9,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,89,0,89.5,66,179.3,104,225.1,116,12.3,1,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,112,0,101.1,119,214.4,67,179.5,112,10.3,5,2,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,71,0,211.2,70,252.7,122,225.8,104,12.3,3,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,47,0,155.3,116,188.2,85,247.0,73,12.3,4,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 
+0,109,0,222.2,113,218.5,122,266.0,88,10.9,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,0,0,1,0,1,0 +0,170,37,178.1,130,242.8,103,243.0,93,13.0,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,163,0,197.0,109,202.6,128,206.4,80,9.1,10,1,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,140,0,149.7,71,212.5,97,245.9,67,12.6,4,3,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,112,0,168.6,102,298.0,117,194.7,110,9.8,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,127,23,182.0,80,216.1,85,156.9,82,9.8,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,121,0,213.2,79,120.7,116,244.4,102,7.5,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,96,40,108.6,90,206.4,154,126.3,118,13.4,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,137,0,104.7,115,249.8,144,192.3,99,8.9,2,1,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,67,0,109.1,134,142.3,76,91.2,86,10.9,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,1,0,1,0 +0,118,23,289.5,52,166.6,111,119.1,88,9.5,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,130,30,185.0,117,249.5,141,157.8,103,7.4,7,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +1,124,0,157.8,71,203.2,114,168.7,82,10.0,2,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,1,0 +0,33,0,190.6,100,161.7,104,189.9,136,13.0,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +0,101,0,174.9,105,262.0,75,210.0,93,8.5,5,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,118,42,148.7,105,167.3,105,270.6,105,10.4,7,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,89,0,125.6,108,213.0,90,181.7,108,5.4,5,1,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,1,0 +0,131,24,135.9,60,233.2,78,210.6,121,9.4,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,99,0,182.6,83,154.5,111,196.0,57,12.1,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,55,0,106.1,77,123.5,100,96.4,92,12.9,3,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,1,0 +0,141,28,308.0,123,247.8,128,152.9,103,7.4,3,1,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 
+0,95,0,194.6,114,232.8,106,173.4,92,3.8,2,3,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,140,0,235.5,81,257.2,130,103.1,111,11.5,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,94,0,181.5,98,199.9,88,287.7,114,6.6,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,95,32,247.0,109,125.6,91,226.5,90,10.5,4,3,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,89,32,209.9,113,249.8,104,224.2,92,8.7,7,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,1,0,0,1 +1,144,0,278.5,95,240.7,90,120.0,90,11.6,5,1,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,13,0,143.1,139,239.6,88,221.7,123,7.1,5,2,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,146,11,180.7,82,173.7,90,231.5,89,10.1,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0,0,1 +0,97,0,169.7,84,165.9,86,191.9,83,12.8,6,3,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,111,0,146.2,55,261.5,83,163.2,116,8.7,3,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0,1,0 +0,74,0,148.5,111,146.5,42,289.2,83,9.9,6,3,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,72,0,165.9,114,235.9,97,210.1,120,12.0,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,97,24,176.1,109,159.4,81,269.1,94,12.1,9,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,169,0,57.1,98,199.7,78,274.7,103,6.5,6,3,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +0,210,31,313.8,87,147.7,103,192.7,97,10.1,7,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,139,43,231.0,85,222.3,82,148.0,105,8.3,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,124,37,161.2,109,204.2,79,231.5,87,8.9,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,108,0,246.2,102,202.4,134,180.1,95,9.4,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,114,0,202.1,100,195.7,102,291.8,120,13.3,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,64,0,216.9,78,211.0,115,179.8,116,11.4,5,3,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,56,24,121.7,87,184.0,76,266.6,98,12.7,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 
+0,111,0,176.4,62,201.0,124,150.4,138,11.2,2,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,31,0,166.1,105,79.3,93,213.7,98,12.7,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,123,0,105.0,150,251.6,90,258.0,93,14.9,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,93,0,152.1,141,215.5,107,262.4,111,12.0,7,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,1,0,1,0 +1,151,0,218.0,57,114.4,88,269.2,95,12.4,1,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +0,128,0,148.5,105,243.0,106,255.2,114,6.8,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,90,26,169.0,104,188.8,104,213.3,76,13.3,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,0,1,0,1 +0,151,0,134.5,88,143.1,112,223.9,61,15.4,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,45,0,207.6,71,152.7,94,217.8,125,12.4,13,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,148,0,124.4,83,179.7,81,253.0,99,11.3,6,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,30,0,169.9,144,225.2,118,169.7,93,11.4,7,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,91,0,159.0,109,255.1,142,82.4,73,10.1,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,138,0,205.9,96,257.1,94,209.0,63,12.1,8,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,68,24,176.0,118,277.9,116,174.7,71,14.7,7,1,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,51,0,169.3,111,139.5,69,197.0,87,12.0,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1,0,1,0 +0,67,0,171.7,80,110.4,81,195.4,111,11.9,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,134,0,204.7,108,143.1,105,165.8,84,11.0,4,6,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,86,0,171.8,106,301.7,44,139.4,108,9.7,5,1,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,117,20,205.7,98,136.1,107,159.4,147,8.7,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,63,0,149.3,104,273.6,75,206.6,72,9.1,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,0,1,0,1,0 +1,178,0,124.5,134,141.2,78,268.2,113,11.4,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,1,0 
+0,163,0,178.7,56,215.7,79,152.7,84,10.6,2,4,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,92,0,139.8,98,174.9,143,201.6,135,9.4,7,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,116,27,204.7,118,209.4,91,212.9,67,7.0,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,189,38,256.7,98,150.5,120,123.0,87,11.4,3,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,176,0,250.9,108,171.4,100,148.6,85,9.6,6,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,190,22,166.5,93,183.0,92,121.0,102,8.5,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,98,19,110.5,87,227.8,97,243.6,84,11.0,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,127,0,239.8,107,128.9,121,249.9,110,11.3,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,138,0,241.8,93,170.5,83,295.3,104,11.8,7,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,117,0,158.7,84,181.7,91,177.3,67,7.7,10,2,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +1,91,0,246.4,110,182.0,98,157.6,106,12.1,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,1,0 +0,46,0,196.7,85,205.9,74,216.6,112,11.2,5,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,0,0,1,0,1,0 +0,85,27,196.4,139,280.9,90,89.3,75,13.8,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +1,53,0,261.2,119,250.8,105,176.0,112,9.8,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,155,0,250.8,146,152.5,105,148.1,104,10.0,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,1,0 +0,72,0,175.5,103,132.3,120,242.9,96,11.8,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,61,33,270.7,53,200.7,116,201.7,102,10.9,3,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,1,26,146.6,68,172.8,67,173.8,113,10.0,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,105,0,227.4,121,268.5,89,143.3,82,13.0,4,1,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +1,133,0,295.0,141,223.6,101,229.4,109,12.9,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +1,74,0,314.1,86,222.4,99,259.0,121,12.3,5,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 
+0,87,0,146.3,108,171.8,102,167.5,66,5.3,9,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,26,0,234.5,109,216.5,129,191.6,94,3.5,6,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,150,0,214.0,117,192.4,89,242.6,99,7.9,4,1,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,74,0,225.2,93,215.1,120,241.8,95,9.1,2,2,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,169,0,179.2,111,175.2,130,228.6,92,9.9,6,2,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,21,0,244.7,81,168.0,117,281.5,87,6.6,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,180,0,224.9,105,250.0,101,216.1,73,6.7,5,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,121,0,144.8,126,200.6,82,208.8,81,13.3,9,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,129,30,177.3,95,211.8,102,240.2,108,9.3,7,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +1,71,0,186.1,114,198.6,140,206.5,80,13.8,5,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,1,0 +0,126,34,244.9,118,219.6,105,210.8,136,9.7,6,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,0,1,0,1 +0,68,0,131.6,89,137.0,109,256.3,107,10.2,5,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,122,0,243.8,98,83.9,72,179.8,84,13.7,8,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,74,25,194.6,84,119.9,103,175.5,75,13.1,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,75,0,138.5,110,153.2,86,215.6,103,11.1,7,1,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,33,29,157.4,99,117.9,80,279.2,79,13.9,11,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,72,0,196.5,88,158.6,129,269.3,118,6.8,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,53,0,119.7,113,189.7,84,256.2,108,12.9,7,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,127,27,140.1,59,223.4,111,257.9,73,3.8,10,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,93,0,149.6,120,200.7,85,181.2,107,14.3,9,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,0,0,1,0,1,0 +0,158,0,209.9,112,221.3,82,210.0,93,8.2,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 
+1,115,0,245.0,97,250.7,75,270.2,124,13.7,8,1,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,137,0,174.0,123,161.3,115,260.7,98,11.4,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +1,82,0,167.1,77,131.8,79,187.4,98,9.4,1,6,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,46,34,191.4,102,361.8,96,147.5,132,7.2,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,88,0,189.8,111,197.3,101,234.5,111,14.9,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,35,27,241.7,87,142.0,101,288.9,68,9.4,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,1,0,0,1 +0,54,0,134.3,73,155.5,100,102.1,68,14.7,4,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,123,0,257.9,92,211.6,71,189.3,104,9.4,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,88,0,153.5,94,251.7,118,182.2,99,8.5,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +1,87,0,240.0,83,134.1,106,189.1,84,9.3,2,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,88,0,274.6,105,161.1,121,194.4,123,9.2,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +1,146,0,149.3,83,187.1,130,149.8,100,7.9,4,7,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +1,56,0,221.9,112,278.2,122,288.1,85,7.1,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,100,0,218.8,125,148.3,102,277.8,97,9.7,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,90,0,198.5,124,266.6,100,243.3,80,8.0,7,2,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,93,0,168.4,114,276.0,127,196.2,48,11.4,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,0,1,1,0 +0,43,0,27.0,117,160.9,97,279.5,96,10.7,3,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,124,28,251.4,104,225.1,89,251.9,121,7.5,5,1,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,90,0,246.4,83,160.3,88,170.9,99,7.6,7,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0,1,0 +0,64,0,146.7,83,148.3,91,238.6,69,12.5,3,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +1,128,0,227.9,130,302.6,71,191.5,82,5.5,7,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,0,0,1,0,1,0 +0,140,0,194.8,107,170.9,99,225.1,93,13.9,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,0,0,1,0,1,0 
+0,57,39,213.0,115,191.1,112,182.7,115,9.5,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1,0,0,1 +0,99,0,241.1,72,155.6,98,188.2,109,11.6,10,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,161,0,107.5,121,256.4,46,247.2,131,12.6,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,22,0,207.7,116,210.6,99,238.2,88,9.6,5,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,70,0,175.4,130,159.5,130,260.6,96,11.6,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0,1,0 +0,87,0,189.5,113,204.9,100,221.7,93,13.4,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,106,0,147.9,97,209.3,99,162.1,80,8.8,5,2,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,86,29,157.2,118,196.3,136,226.7,109,8.4,5,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +1,106,0,204.0,84,168.5,61,164.0,102,13.3,3,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,1,0 +0,57,0,85.9,92,193.9,127,231.5,93,10.1,2,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,136,0,252.4,74,167.9,81,248.3,110,10.7,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,149,0,242.5,83,245.4,97,219.6,80,10.0,3,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,117,31,104.9,115,237.6,125,263.4,104,7.7,6,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,120,0,180.0,80,224.2,82,265.4,91,4.7,7,3,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,81,0,149.4,68,171.9,98,214.5,97,17.9,3,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +0,87,0,167.3,119,198.5,119,133.1,88,11.0,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,1,0 +0,74,31,249.4,70,209.5,59,180.6,75,9.9,2,4,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,99,0,159.7,83,155.4,121,255.7,114,8.4,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +1,144,0,61.6,117,77.1,85,173.0,99,8.2,7,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,100,0,166.0,102,236.1,97,134.3,93,10.9,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0,1,0 +0,141,36,187.5,99,241.4,116,229.5,105,10.5,5,3,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,32,0,164.8,98,229.9,96,167.3,108,14.8,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 
+1,55,0,191.9,91,256.1,110,203.7,101,14.3,6,1,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +0,143,0,194.3,99,123.6,133,229.5,99,10.2,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,1,0,1,0 +0,74,0,136.7,106,228.6,105,265.3,114,9.8,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,105,40,236.5,111,117.0,110,221.1,115,8.1,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,89,0,206.9,134,167.7,105,155.7,86,10.9,4,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,1,0 +0,155,0,165.4,108,183.7,103,80.2,108,8.9,4,3,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,119,19,178.1,110,212.8,100,226.3,123,10.0,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,100,0,160.3,138,221.3,92,150.4,120,11.2,2,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,111,0,197.1,117,227.8,128,214.0,101,9.3,11,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,181,40,105.2,61,341.3,79,165.7,97,6.3,3,2,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,95,36,283.1,112,286.2,86,261.7,129,11.3,3,3,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,104,0,113.6,87,158.6,98,187.7,87,10.5,6,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,70,0,232.1,122,292.3,112,201.2,112,0.0,0,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,120,24,212.7,73,257.5,103,227.8,119,9.7,13,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +1,88,0,73.3,86,161.4,82,239.6,76,8.2,3,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +1,111,0,176.9,128,102.8,56,213.7,84,10.5,2,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,33,35,161.9,85,151.2,82,191.0,131,8.5,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,106,0,128.6,83,134.0,114,210.6,113,11.4,2,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,54,0,190.5,108,259.7,108,141.5,111,9.7,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +0,87,0,223.2,109,127.5,86,289.3,83,14.5,4,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +1,94,0,157.9,105,155.0,101,189.6,84,8.0,5,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 
+1,135,41,173.1,85,203.9,107,122.2,78,14.6,15,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,1 +0,107,0,273.5,104,183.8,68,153.8,67,11.0,9,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,1,0 +1,159,0,275.8,103,189.5,108,223.9,93,7.4,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,0,1,0,1,0 +0,106,0,119.2,142,228.4,139,197.9,61,8.4,9,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,136,24,174.6,76,176.6,114,214.4,91,8.8,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +1,116,0,133.3,94,247.8,126,219.0,78,11.3,5,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,115,33,145.0,72,194.5,157,242.3,138,14.2,3,2,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +1,103,0,150.6,125,169.1,126,221.2,104,10.4,8,8,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +1,95,37,220.2,109,185.3,99,205.1,82,4.1,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,1 +0,115,0,109.7,148,223.8,87,240.3,96,15.4,8,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,143,0,155.4,112,290.9,92,228.4,91,13.9,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,48,43,172.0,111,200.2,64,233.1,96,8.0,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,94,0,235.6,131,194.8,107,170.6,93,8.6,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,0,1,1,0 +0,153,31,218.5,130,134.2,103,118.9,105,9.4,6,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,94,28,92.7,107,127.8,86,225.6,86,9.9,4,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0,0,1 +0,107,0,90.7,90,207.5,109,169.4,96,5.6,5,2,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,91,37,162.3,107,233.9,115,277.4,94,9.2,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,141,0,146.5,121,169.9,125,238.8,112,8.2,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,58,0,210.1,126,248.9,108,158.6,88,14.4,2,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0,1,0 +0,49,28,214.4,78,235.2,100,206.2,107,8.0,13,3,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,41,34,194.4,63,254.9,110,160.2,115,17.2,9,2,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 
+1,137,0,237.3,103,176.7,84,263.4,81,14.2,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,1,0 +0,111,0,255.9,97,204.1,129,171.3,84,12.3,5,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,71,0,197.9,108,181.5,109,281.4,56,6.7,5,3,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,43,35,200.2,105,244.4,88,207.2,97,11.6,4,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,0,0,1,0,0,1 +1,97,0,120.8,96,169.8,101,194.1,63,11.9,3,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,3,36,118.1,117,221.5,125,103.9,89,11.9,6,2,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,124,0,131.8,82,284.3,119,305.5,101,11.3,2,1,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,86,29,225.4,79,187.1,112,281.1,112,12.9,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,87,0,205.2,106,99.5,122,189.5,75,13.4,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,83,30,272.5,105,253.0,83,180.8,123,8.7,6,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +1,67,35,181.1,59,215.9,116,216.3,106,16.9,4,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1 +0,46,0,122.2,67,167.2,62,194.8,98,9.7,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,129,33,119.6,104,278.7,88,263.4,175,5.9,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,90,0,109.6,88,137.6,108,159.7,121,11.0,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,97,0,112.7,119,217.7,109,152.1,76,6.5,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,0,1,0,1,0 +0,87,0,136.3,97,172.2,108,137.5,101,7.1,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,141,37,185.4,87,178.5,128,218.3,107,8.0,3,4,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +1,136,0,199.6,89,211.4,96,72.4,84,11.0,4,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,1,0 +0,88,0,218.2,76,169.3,60,141.1,99,8.0,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,170,0,259.9,68,245.0,122,134.4,121,8.4,3,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,0,1,1,0 +0,44,0,143.2,77,169.8,114,215.8,77,7.6,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,121,24,218.2,88,348.5,108,212.6,118,7.5,7,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 
+1,111,0,249.8,109,242.4,106,231.8,78,11.6,4,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,105,24,274.7,99,193.5,118,299.6,109,10.8,3,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,112,0,167.8,88,247.9,81,155.1,108,11.9,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1,0,1,0 +0,73,0,182.3,115,199.2,97,120.2,113,18.0,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,147,0,157.0,79,103.1,94,211.8,96,7.1,6,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +0,66,0,207.7,85,196.7,112,261.7,83,6.8,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +1,136,0,250.2,121,267.1,118,151.0,114,13.0,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,119,0,81.9,75,253.8,114,213.1,125,8.9,1,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,135,0,246.8,129,187.8,121,154.5,109,12.6,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,102,0,103.1,70,275.0,129,141.1,92,11.2,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,169,0,147.2,115,161.9,123,142.1,103,7.2,6,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,60,0,252.7,97,221.1,121,109.9,100,12.4,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,73,0,192.2,86,168.6,116,139.8,87,9.4,6,1,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,83,26,226.4,117,234.7,97,133.6,82,10.8,7,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,90,0,145.5,92,217.7,114,146.9,123,10.9,2,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,148,0,178.3,98,282.6,110,181.0,98,11.4,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,59,29,133.1,114,221.2,82,131.6,103,6.8,3,1,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,152,20,214.6,108,96.6,82,170.7,145,7.9,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,1,0,0,1 +0,136,33,203.9,106,187.6,99,101.7,107,10.5,6,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1 +0,112,0,185.4,114,191.4,119,144.0,78,10.0,11,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,122,0,140.0,101,196.4,77,120.1,133,9.7,4,4,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,1,0 
+0,44,0,240.3,146,164.6,83,240.7,106,10.6,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,122,23,134.2,85,227.3,132,122.4,96,8.5,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,89,0,141.1,92,249.1,126,136.0,73,10.8,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,176,0,47.4,125,167.8,90,163.1,107,10.5,8,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,64,22,200.4,80,131.1,84,230.7,67,7.6,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0,0,1 +0,112,0,167.6,100,154.5,90,281.4,107,17.3,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,133,32,221.1,137,264.9,99,168.9,108,15.4,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,1 +0,52,0,165.5,78,205.5,89,213.6,124,12.2,6,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,91,34,175.3,96,262.3,122,143.9,76,5.6,11,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,127,0,146.7,91,203.5,78,203.4,110,13.7,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,0,1,0,1,0 +0,153,22,167.7,104,246.8,91,203.9,117,7.5,11,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,117,0,184.5,97,351.6,80,215.8,90,8.7,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,163,0,202.9,100,178.6,46,203.8,116,12.8,3,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,1,0,1,0 +1,76,0,273.3,66,263.6,121,165.2,84,12.0,7,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +0,80,0,194.8,116,209.9,93,194.1,100,12.8,3,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,136,27,187.7,84,221.0,147,145.7,110,10.0,4,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,91,0,133.7,75,195.3,87,280.5,89,5.9,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,143,0,209.1,127,106.1,80,179.6,90,14.0,6,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,125,29,260.8,81,163.7,112,271.7,117,17.0,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1 +0,126,0,211.6,70,216.9,80,153.5,60,7.8,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,87,0,156.8,93,215.8,68,223.3,77,7.6,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 
+0,119,0,109.2,96,153.1,80,240.0,102,9.8,5,2,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,13,0,303.2,133,170.5,86,227.6,80,11.5,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,138,0,240.8,104,144.5,92,125.7,98,11.6,1,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,159,0,167.4,68,143.8,74,140.1,111,10.3,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0,1,0 +0,111,0,110.4,103,137.3,102,189.6,105,7.7,6,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,46,0,90.4,108,276.2,77,146.5,111,12.7,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,1,0,1,0 +1,68,0,162.1,86,155.0,86,189.7,87,11.0,9,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,107,0,212.1,95,150.1,88,219.8,111,7.7,2,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,70,0,214.8,87,131.0,114,216.9,104,9.4,3,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0,1,0 +0,215,0,83.6,148,120.9,91,226.6,110,10.7,9,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0,1,0 +0,22,23,182.1,94,164.6,59,128.8,102,12.7,4,3,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,122,0,170.5,94,173.7,109,248.6,75,11.3,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,73,28,198.2,107,139.1,123,199.1,139,8.8,1,2,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,75,0,143.2,92,209.1,142,173.0,96,11.9,9,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,87,0,184.5,81,172.0,103,183.4,96,13.7,3,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,148,0,185.2,87,170.4,96,165.1,104,9.5,13,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,105,0,156.5,102,140.2,134,227.4,111,12.2,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,182,0,69.1,114,230.3,109,256.7,96,6.5,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,139,0,139.0,110,132.9,93,272.0,120,12.1,1,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,105,0,101.4,48,159.1,119,259.2,53,12.2,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,166,0,274.3,110,52.9,109,246.1,119,10.9,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,0,1,0,1,0 
+0,60,0,220.6,57,211.1,115,249.0,129,6.8,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +1,76,0,107.3,140,238.2,133,271.8,116,10.0,3,4,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +1,28,0,121.7,48,125.8,112,261.6,122,8.3,2,6,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,94,0,163.5,136,143.7,111,253.4,82,12.6,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,146,19,176.6,88,162.7,66,215.5,98,14.6,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,101,16,118.9,112,228.3,97,180.1,111,8.2,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,132,0,240.1,115,180.4,91,133.4,122,8.0,6,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,0,1,0,1,0 +0,93,0,179.3,93,188.8,65,253.2,88,12.1,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,105,0,246.4,83,256.2,101,169.0,151,3.8,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,100,38,177.1,88,163.7,108,242.7,72,7.4,2,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,134,0,258.8,85,129.5,114,193.6,106,10.9,7,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,63,0,211.8,84,230.9,137,217.1,99,10.7,9,3,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,126,0,226.2,88,140.3,114,208.9,110,6.4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,0,1,0,1,0 +0,166,0,220.7,106,177.8,118,206.1,102,12.4,9,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,160,0,166.4,117,317.0,129,160.4,121,10.0,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,162,0,115.1,89,196.8,111,212.4,98,11.4,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,70,0,213.4,86,204.7,77,256.6,101,5.7,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,116,19,155.7,104,185.4,118,192.7,116,8.2,2,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,75,46,214.1,62,200.9,111,246.8,126,9.2,6,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,74,0,200.4,87,309.2,105,152.1,118,10.0,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,115,16,133.3,110,185.7,111,161.5,113,5.6,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 
+0,42,0,155.4,127,164.1,45,157.7,128,9.0,3,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,132,0,195.1,100,148.8,95,224.5,117,6.7,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,171,0,189.8,122,173.7,85,257.1,84,10.3,1,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,135,22,197.1,113,259.4,95,134.7,135,14.6,5,2,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,99,0,217.2,112,246.7,89,226.1,89,15.8,7,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,27,0,236.7,110,231.9,92,164.7,85,12.7,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,139,0,192.8,104,234.4,96,203.2,101,13.0,3,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0,1,0 +0,76,0,224.4,121,147.9,97,183.8,74,6.7,2,2,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,123,0,114.4,91,216.6,123,250.6,102,11.0,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,54,39,206.9,143,127.8,72,199.2,120,9.2,1,3,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,70,0,134.7,96,235.9,90,260.2,113,7.6,6,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,163,25,219.6,99,210.4,99,242.7,88,13.8,8,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,96,33,183.3,115,201.4,87,177.4,84,10.4,15,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,62,0,281.0,66,160.6,108,77.9,74,0.0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,1,0,1,0 +0,115,0,147.9,109,228.4,117,299.7,90,9.6,9,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,97,0,144.2,91,226.7,137,144.6,72,13.8,4,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,137,19,175.3,96,241.3,146,211.4,109,7.8,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,82,0,199.3,112,193.4,120,254.4,117,7.0,10,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,118,36,294.9,106,165.7,115,189.2,63,9.8,5,3,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,64,0,97.2,80,186.2,90,189.0,92,10.4,6,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,186,26,74.3,107,177.3,116,296.3,90,14.5,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 
+0,117,0,161.6,104,196.3,119,294.8,111,13.8,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,117,0,181.5,95,205.1,88,204.0,82,14.7,9,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,116,35,118.0,103,167.2,106,205.7,102,11.8,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,164,30,238.8,100,230.0,121,206.3,66,13.2,8,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,103,0,246.5,47,195.5,84,200.5,96,11.7,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,1,0,1,0 +0,137,50,186.5,94,178.0,106,215.6,100,12.1,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,1,0,0,1 +0,97,28,202.3,97,69.2,84,257.6,64,6.7,3,1,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,144,0,201.1,99,303.5,74,224.0,119,13.2,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0,1,0 +0,96,26,145.8,108,192.2,89,165.1,96,9.9,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,183,0,116.7,92,213.8,112,214.3,112,9.7,6,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,42,0,303.9,106,232.2,54,147.1,76,5.8,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,100,0,107.2,98,86.8,122,156.2,117,9.7,4,1,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,1,0 +0,131,0,211.8,115,260.5,102,144.2,96,10.8,7,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,88,0,113.7,67,165.1,127,141.5,142,10.8,3,1,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,91,0,149.0,115,245.3,105,260.0,94,8.3,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,104,0,118.5,92,177.8,109,255.7,98,12.1,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,63,49,214.9,86,198.2,89,170.8,139,8.2,5,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,159,15,113.9,102,145.3,146,195.2,137,11.8,9,1,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +1,147,0,225.2,111,184.9,98,143.2,146,9.9,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,0,1,1,0 +0,123,0,172.2,92,162.6,76,250.3,101,8.7,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,1,0,1,0 +0,100,27,221.7,100,236.1,70,192.7,91,8.0,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 
+0,105,0,150.0,106,293.8,123,250.7,65,10.3,7,3,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +0,163,23,160.0,104,189.4,64,229.9,118,10.4,7,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,90,0,142.4,126,126.2,118,274.2,71,4.6,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,125,0,182.3,64,139.8,121,171.6,96,11.6,7,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,64,0,219.6,126,303.3,100,154.5,65,9.7,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,113,0,92.6,85,177.6,92,159.8,72,14.4,4,3,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,101,21,238.0,88,209.6,84,233.0,95,10.0,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,123,0,224.0,99,210.7,80,231.9,75,2.1,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +1,212,0,226.0,127,304.6,83,181.2,132,12.6,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,73,0,224.4,90,159.5,88,192.8,74,13.0,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,44,0,204.6,117,205.2,94,164.6,84,10.7,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,0,0,1,0,1,0 +0,96,26,175.8,96,206.6,84,178.0,105,11.1,2,2,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,74,33,193.7,91,246.1,96,138.0,92,14.6,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,77,0,169.4,102,184.9,144,234.3,89,2.0,7,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,120,0,252.1,110,226.1,103,155.6,83,13.8,3,1,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,122,0,173.6,110,91.7,84,211.7,103,9.7,7,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,87,0,169.1,105,169.9,102,244.9,106,9.9,10,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,52,24,170.9,71,201.4,80,159.0,124,4.1,5,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,48,36,230.9,92,167.6,121,270.0,87,7.6,4,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +1,61,40,105.0,78,180.6,100,174.1,115,10.2,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,141,0,215.6,113,200.6,81,153.8,107,12.4,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 
+1,170,0,285.7,44,167.5,144,260.0,97,8.7,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,17,35,198.5,123,270.6,74,209.9,130,8.1,10,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,162,0,220.6,117,155.2,121,186.7,89,10.5,11,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,1,0,1,0 +0,85,0,96.7,97,193.8,95,171.7,88,9.7,3,2,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,160,0,97.5,113,268.1,69,255.3,62,13.2,4,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,29,37,235.0,101,183.3,79,139.8,106,5.7,7,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1 +0,91,0,109.8,100,189.6,104,206.7,85,11.1,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,96,0,197.7,68,250.5,53,181.2,67,10.5,3,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,104,0,264.0,108,132.2,75,177.7,91,10.6,8,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,95,27,129.5,106,248.9,90,268.0,115,11.9,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,84,0,159.0,80,167.9,128,167.6,101,12.3,5,1,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,157,0,276.2,95,165.8,119,151.6,79,2.2,4,3,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,165,0,207.7,109,164.8,94,54.5,91,7.9,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,57,30,234.5,130,195.2,116,268.8,94,11.4,4,2,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,1 +0,95,0,167.6,96,176.0,89,250.9,113,13.4,6,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,51,28,276.7,121,203.7,99,246.2,88,8.3,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,97,0,146.0,121,203.0,141,151.8,120,13.3,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,13,0,220.4,100,211.2,79,259.3,112,13.6,8,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,50,0,131.7,108,216.5,103,196.1,126,11.0,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,1,0,1,0 +1,46,0,250.3,100,260.6,90,195.0,104,13.3,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,121,35,68.7,95,209.2,69,197.4,42,11.4,4,1,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 
+0,68,0,207.6,68,251.6,123,191.6,100,10.9,6,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,72,0,118.2,106,167.2,136,214.2,106,12.2,3,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,82,0,208.8,101,213.7,87,175.1,86,12.4,6,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,1,1,0 +0,38,0,137.8,86,286.3,76,167.0,77,14.1,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,0,1,0,1,0 +0,41,0,209.9,105,121.9,105,253.7,104,9.6,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,96,0,179.5,125,162.3,139,264.5,133,6.6,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,129,0,216.0,85,186.9,114,210.7,109,4.9,10,2,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,31,28,210.5,101,250.5,86,241.6,125,11.5,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,122,0,234.1,101,200.2,121,237.4,89,13.1,9,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,51,0,181.5,108,196.9,87,187.2,119,10.3,2,1,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,109,0,222.5,74,169.7,75,264.3,94,9.0,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,161,38,240.4,112,201.8,102,206.1,112,16.1,6,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,72,0,109.1,97,115.7,96,295.8,84,8.3,6,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,65,29,158.1,104,322.2,81,210.0,96,8.9,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,129,31,193.0,99,224.8,87,197.6,91,10.3,8,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,137,0,205.9,88,209.3,86,289.9,84,14.5,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,48,34,198.0,70,273.7,121,217.9,71,7.6,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0,0,1 +0,134,0,244.1,99,246.9,111,200.0,133,7.2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,125,0,240.7,82,269.4,85,187.1,74,10.1,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,153,0,122.5,145,273.3,103,197.8,71,8.0,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,103,24,111.8,85,239.6,102,268.3,81,6.9,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 
+0,45,0,155.7,110,260.3,103,192.2,98,11.0,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,80,38,236.6,69,197.5,68,209.5,102,9.5,10,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0,0,1 +0,57,0,149.3,100,200.2,110,231.7,101,11.9,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,94,0,181.3,135,182.4,108,180.6,103,6.7,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,59,0,151.8,98,209.9,92,266.9,86,11.9,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,0,0,1,0,1,0 +0,72,0,139.9,117,223.6,96,240.8,93,12.7,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,0,1,0,1,0 +1,62,0,248.7,109,220.0,118,265.7,78,13.2,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,155,30,61.6,103,255.1,110,225.9,96,12.4,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,96,21,247.6,95,256.3,150,158.6,72,10.8,6,2,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,77,0,185.9,95,212.0,98,282.3,81,11.3,4,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,58,30,178.1,111,236.7,109,264.0,118,8.4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,134,32,80.3,94,199.9,124,170.8,117,16.6,3,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,24,0,235.6,132,115.9,129,185.4,136,16.2,2,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,158,0,172.4,114,256.6,69,235.3,104,0.0,0,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,89,0,178.7,81,233.7,74,131.9,120,9.1,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,138,17,225.2,116,173.4,88,145.8,99,11.7,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1 +1,61,27,187.5,124,146.6,103,225.7,129,6.4,6,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,123,0,204.4,88,137.5,111,226.0,100,10.0,4,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,87,0,134.2,80,165.0,71,173.1,102,10.7,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,1,0 +0,74,0,262.3,114,198.9,96,165.9,90,6.6,5,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,37,0,191.1,69,129.2,113,207.5,117,12.9,1,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 
+1,105,31,109.6,108,249.3,119,321.2,101,8.3,4,4,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,56,0,197.0,110,222.8,102,225.3,91,10.6,6,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,64,31,228.6,88,248.5,109,167.1,124,9.0,1,3,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,202,0,115.4,137,178.7,70,185.7,113,6.0,3,3,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,91,0,147.2,121,175.2,87,136.3,80,13.3,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,120,0,198.8,56,230.1,73,119.8,81,9.9,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +1,89,0,129.2,71,214.1,68,214.9,100,10.3,4,5,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,95,0,238.1,65,187.2,98,190.0,115,11.8,4,4,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,92,0,208.0,125,198.9,76,76.4,97,8.6,6,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,45,0,211.3,87,165.7,97,265.9,72,13.3,6,1,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,106,0,194.8,133,213.4,73,190.8,92,11.5,7,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,1,0,1,0 +1,125,0,143.2,80,88.1,94,233.2,135,8.8,7,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,129,0,143.7,114,297.8,98,212.6,86,11.4,8,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0,1,0 +0,159,0,198.8,107,195.5,91,213.3,120,16.5,7,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,99,33,179.1,93,238.3,102,165.7,96,10.6,1,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 diff --git a/sagemaker_model_monitor/introduction/test_data/training-dataset-without-header.csv b/sagemaker_model_monitor/introduction/test_data/training-dataset-without-header.csv new file mode 100644 index 0000000000..a1bea1dc61 --- /dev/null +++ b/sagemaker_model_monitor/introduction/test_data/training-dataset-without-header.csv @@ -0,0 +1,2333 @@ +0,106,0,274.4,120,198.6,82,160.8,62,6.0,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,28,0,187.8,94,248.6,86,208.8,124,10.6,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0,1,0 +1,148,0,279.3,104,201.6,87,280.8,99,7.9,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,132,0,191.9,107,206.9,127,272.0,88,12.6,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 
+0,92,29,155.4,110,188.5,104,254.9,118,8.0,4,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,131,25,192.7,85,225.9,105,254.2,59,10.9,6,2,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,69,0,143.6,88,141.8,86,194.0,83,10.8,5,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,93,0,114.3,100,221.1,103,126.3,88,10.9,9,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,90,0,193.7,83,154.2,79,299.0,60,12.7,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,60,0,125.1,99,248.8,62,211.3,79,11.2,3,3,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,139,0,203.2,81,152.5,99,197.8,76,9.7,3,2,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,75,39,198.2,107,280.4,132,129.6,73,11.3,7,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,54,0,273.8,113,119.6,156,267.6,117,11.7,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,91,0,190.5,128,205.5,103,130.7,63,13.8,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,189,30,155.2,116,195.5,50,170.1,108,15.4,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,84,0,203.4,125,182.9,88,213.7,121,13.8,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,87,36,171.2,138,185.8,102,227.6,97,10.8,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,1 +1,60,0,289.8,101,255.6,115,242.8,76,11.7,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +1,40,0,81.7,123,210.2,108,212.0,64,11.3,3,6,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,104,0,139.7,78,202.6,119,203.6,102,11.3,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,0,1,0,1,0 +0,99,39,126.8,94,293.6,115,174.1,91,8.4,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1 +0,124,0,167.4,119,233.2,143,109.6,115,10.3,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,127,0,176.9,110,167.9,100,182.2,138,7.7,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,1,0 +0,33,0,251.9,81,194.6,96,211.2,87,8.4,3,2,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,113,0,159.8,143,210.1,93,175.1,86,13.1,7,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 
+0,39,0,187.2,110,114.7,116,104.7,83,13.2,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +1,100,0,113.3,96,197.9,89,284.5,93,11.7,2,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,80,0,206.3,97,154.9,98,263.6,82,12.4,12,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,127,28,235.6,124,236.8,113,241.2,127,7.7,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,69,0,228.2,70,263.7,80,142.6,60,10.7,5,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,116,0,288.0,120,255.8,90,233.4,99,13.4,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,145,0,187.9,110,197.0,117,167.0,108,4.8,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,94,0,190.4,91,92.0,107,224.8,108,13.6,17,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,145,0,245.8,116,286.7,91,240.7,115,9.0,13,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,40,31,224.7,69,134.5,81,120.3,104,7.5,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,43,0,241.9,101,129.4,121,264.8,104,5.9,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,147,0,168.6,92,187.7,107,216.5,95,14.4,8,2,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,68,0,232.4,76,153.3,103,214.6,107,10.5,2,1,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,90,22,124.5,94,231.7,90,222.2,108,6.4,12,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,110,0,242.5,110,162.3,140,184.1,86,7.8,3,4,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,57,30,179.2,105,283.2,83,228.1,77,14.7,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,137,0,242.1,118,191.0,93,218.6,50,14.7,2,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +1,89,0,303.9,95,260.9,114,312.1,89,5.3,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +1,71,0,238.0,82,278.5,94,193.1,134,11.8,10,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,166,0,203.4,81,167.7,110,132.0,124,9.2,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,125,0,298.4,78,270.5,142,107.3,84,12.2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 
+0,63,0,207.6,96,229.0,112,162.6,131,13.3,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,83,0,159.3,104,202.3,98,229.0,73,9.5,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,31,28,171.8,116,240.7,125,245.5,80,10.6,7,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,145,0,129.4,97,185.4,101,204.7,106,1.1,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0,1,0 +0,146,0,205.4,101,134.9,77,310.5,83,10.3,2,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,1,0,1,0 +0,12,0,216.7,117,116.5,126,220.0,110,9.8,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,1,0 +0,136,0,204.5,63,208.8,95,224.0,119,9.8,2,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,110,0,185.1,100,165.1,88,111.6,104,6.3,4,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,1,0,1,0 +0,138,0,146.5,101,284.5,142,176.0,98,14.0,6,3,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,107,28,201.8,79,304.9,128,225.6,133,11.9,8,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,1,0,0,1 +0,101,23,262.2,101,157.0,80,129.1,100,7.3,14,1,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +1,91,0,145.0,89,175.8,102,223.7,151,16.7,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +0,96,27,261.3,96,220.9,101,179.4,97,11.3,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,136,0,256.8,90,230.1,104,143.6,82,9.1,10,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,1,0 +0,111,0,142.3,75,122.8,106,229.5,94,12.8,9,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +1,39,0,295.4,126,232.1,117,204.4,123,11.5,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,105,21,147.0,112,197.3,43,267.4,93,8.7,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,92,47,141.6,95,207.9,130,203.6,95,10.2,11,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0,0,1 +0,95,0,141.1,84,211.4,108,103.7,127,5.9,6,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,151,0,194.8,106,292.7,103,224.6,82,5.5,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,108,0,178.3,137,189.0,76,129.1,102,14.6,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 
+0,129,0,198.4,91,264.7,106,111.4,101,9.2,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,109,0,193.6,58,148.7,115,282.5,105,13.1,1,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,116,0,245.9,73,240.1,87,158.7,89,8.9,5,3,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,146,0,115.6,77,213.6,100,218.4,72,10.7,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,34,0,151.0,102,131.4,101,186.6,86,9.9,7,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,1,0,1,0 +1,72,0,287.4,116,235.3,126,292.1,114,5.0,3,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,104,0,164.2,109,155.4,90,168.9,117,10.7,8,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,86,0,150.5,92,120.3,95,271.2,96,9.0,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,1,0,1,0 +0,160,0,171.2,103,243.5,121,178.2,92,13.0,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,129,0,54.7,131,256.1,105,176.6,135,11.1,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,1,0,175.2,74,151.7,79,230.5,109,5.3,3,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,124,0,193.0,97,89.8,99,172.8,104,15.3,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,106,0,223.0,121,110.1,98,188.7,107,7.1,12,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,91,0,151.8,115,103.6,116,156.3,86,12.2,4,1,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,1,0 +0,150,28,174.4,75,169.9,80,201.6,130,11.0,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,27,0,227.4,67,248.0,115,61.4,109,7.8,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,101,0,93.8,127,150.0,104,241.1,116,10.7,2,1,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,82,0,207.0,90,232.9,83,172.4,108,9.1,8,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,78,0,140.7,77,195.2,114,252.9,107,11.7,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,0,1,0,1,0 +0,180,0,143.3,134,180.5,113,184.2,87,10.1,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,24,0,265.6,86,208.8,102,182.5,105,11.1,6,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 
+0,128,40,171.2,88,145.7,109,196.8,93,14.0,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,75,0,209.4,133,211.5,121,291.2,123,7.2,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,1,0,1,0 +0,9,16,88.5,87,178.8,108,228.7,96,11.5,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,166,28,175.8,126,253.6,76,128.5,72,11.4,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1 +0,80,30,184.2,132,167.5,109,212.8,114,10.0,10,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1 +0,136,0,102.1,75,219.5,97,73.7,92,9.8,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,87,33,125.0,99,235.3,81,215.3,95,10.2,7,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,71,22,141.4,107,163.0,105,220.0,99,5.6,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,146,0,169.5,93,230.9,71,269.8,115,9.0,7,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,99,19,87.7,103,223.0,86,182.3,112,7.3,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,0,0,1,0,0,1 +0,80,0,202.4,118,260.2,67,177.4,112,9.2,5,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,140,0,173.2,91,196.8,106,209.3,128,11.2,5,3,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,13,21,315.6,105,208.9,71,260.1,123,12.1,3,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,82,34,232.6,121,153.2,115,286.7,77,4.7,3,3,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,123,39,270.4,99,245.1,110,108.9,113,15.4,7,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,113,23,205.0,101,152.0,60,158.6,59,10.2,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,42,0,241.2,134,116.5,114,152.2,91,10.6,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,86,0,83.8,121,240.2,96,158.6,108,6.7,8,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,103,0,166.6,84,192.4,91,167.9,115,7.7,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,124,0,191.3,134,261.5,113,182.3,111,10.0,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +1,47,0,47.8,120,178.9,123,152.6,96,13.3,7,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,1,0 
+0,101,0,124.8,66,257.2,85,193.2,115,13.4,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,118,24,118.1,83,109.6,72,245.5,73,16.9,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,87,0,124.3,91,173.4,105,256.3,109,7.5,5,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0,1,0 +0,100,0,203.1,96,217.0,126,180.9,122,13.5,2,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,122,0,136.7,115,243.1,137,188.9,110,8.6,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,0,1,0,1,0 +0,144,51,283.9,98,192.0,109,196.3,85,10.0,4,1,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,132,0,206.2,100,211.2,118,196.2,122,10.2,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,140,0,159.1,104,269.8,106,220.4,116,10.3,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,103,31,107.7,124,188.9,104,196.2,98,8.9,3,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,90,0,113.2,108,189.3,63,271.8,124,14.1,4,3,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,136,0,259.4,99,172.7,125,293.7,78,10.7,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,109,0,170.7,101,240.2,82,119.0,112,11.4,4,2,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,134,0,202.7,105,224.9,90,253.9,108,12.1,7,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,106,0,235.2,121,220.6,87,236.3,91,11.8,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,1,0,1,0 +0,68,0,226.7,94,168.4,129,188.7,117,10.2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,159,19,184.1,78,194.5,71,225.6,101,16.9,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,140,0,231.9,101,160.1,94,110.4,98,14.3,6,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,117,0,149.9,95,256.1,110,212.7,92,13.3,13,2,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,65,0,105.7,95,141.8,100,180.5,105,6.6,12,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,52,0,251.4,118,196.6,80,192.0,53,11.0,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,1,0,1,1,0 +0,93,21,117.9,131,164.5,115,217.0,86,9.8,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 
+0,75,41,130.9,115,203.4,110,171.7,68,12.4,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,1,0,0,1 +0,72,0,253.0,73,219.3,78,210.8,89,9.8,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,105,0,251.6,88,175.1,103,184.4,112,5.4,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,125,0,140.1,132,209.6,126,264.1,77,8.0,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0,1,0 +0,57,0,158.1,117,115.2,149,182.4,92,11.8,7,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,86,21,197.9,99,165.6,100,208.0,120,10.1,9,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,98,21,64.6,98,176.1,86,244.8,84,0.0,0,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,71,39,183.2,103,209.4,111,172.4,109,11.9,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,156,0,178.8,94,178.4,97,169.2,77,7.5,3,1,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,103,0,167.8,121,212.9,123,208.2,73,13.0,3,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,1,0,1,0 +0,165,0,156.0,88,276.1,81,175.9,94,9.3,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1,0,1,0 +0,73,0,175.4,130,248.1,105,122.4,85,12.2,4,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,117,14,80.2,81,219.0,103,122.6,102,8.6,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1,0,0,1 +0,121,0,177.2,142,123.5,88,213.2,51,8.4,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,1,0,1,0 +0,55,20,189.3,95,118.6,113,250.2,102,12.5,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,88,31,181.6,91,213.2,120,207.8,104,11.4,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,84,42,165.3,97,223.5,118,260.8,72,7.6,7,3,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,117,0,144.6,115,258.8,66,253.2,113,7.4,9,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,65,0,211.3,120,162.6,122,134.7,118,13.2,5,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,110,0,99.4,62,275.0,86,212.1,94,16.7,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,149,18,148.5,106,114.5,106,178.3,98,6.5,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 
+0,140,0,125.3,84,167.6,121,260.6,94,8.4,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,82,0,189.2,81,184.4,117,255.8,83,10.6,5,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,55,0,189.0,100,118.5,99,248.1,87,17.1,6,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,41,41,207.3,95,137.3,120,115.7,74,5.9,3,1,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,190,0,182.2,101,212.3,95,233.0,123,9.3,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,55,0,132.0,103,279.6,114,180.0,74,13.5,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,134,0,242.4,126,152.9,115,318.3,115,11.8,6,1,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +1,45,0,78.6,106,187.3,110,184.2,111,7.4,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,1,0,1,0 +0,70,0,126.3,99,141.6,106,255.9,96,9.6,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,135,0,190.9,44,161.4,109,231.9,100,8.4,2,1,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,127,28,95.9,117,159.5,131,152.8,132,10.4,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1,0,1 +0,138,0,220.2,89,88.3,125,195.3,79,12.9,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,101,0,257.3,84,184.8,115,108.9,109,13.5,7,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,121,0,190.7,103,183.5,117,220.8,103,9.8,4,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,126,26,129.3,123,176.5,114,154.5,102,9.6,7,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1 +0,95,0,149.2,96,260.7,116,201.0,120,8.1,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,143,0,119.1,117,287.7,136,223.0,100,12.2,4,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,76,0,143.7,55,173.1,108,239.1,95,5.8,6,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,78,0,147.1,80,199.7,100,160.7,106,13.7,7,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0,1,0 +0,75,0,153.2,78,210.8,99,153.5,100,7.8,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,1,0 +0,116,0,146.4,123,176.6,113,212.6,102,7.8,5,1,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 
+1,197,0,127.3,80,222.3,115,173.9,95,13.7,5,5,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +0,91,31,273.0,78,215.5,98,104.7,114,9.6,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +1,115,0,345.3,81,203.4,106,217.5,107,11.8,8,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,1,0 +0,16,0,144.8,84,164.9,141,231.5,75,8.2,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,103,0,173.5,83,244.3,65,221.6,66,9.7,2,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,67,35,245.4,89,148.2,102,274.0,136,7.5,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +1,51,0,153.6,108,232.9,85,214.2,92,14.1,4,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +0,62,0,182.3,101,328.2,93,245.0,131,11.2,1,2,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,90,0,102.0,118,113.3,134,188.6,105,11.4,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,1,0,1,0 +1,95,41,136.8,91,200.8,61,133.7,67,10.3,9,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,1,0,1 +0,116,0,167.8,119,142.0,123,190.7,128,7.3,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +1,138,0,286.2,61,187.2,60,146.2,114,11.0,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,1,0 +0,82,29,207.2,111,254.1,137,169.3,92,9.5,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,78,13,281.2,93,178.2,101,244.2,129,6.4,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,83,0,134.8,96,167.2,78,161.5,123,7.7,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,0,1,0,1,0 +0,108,0,187.4,101,199.9,126,216.1,107,12.6,8,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,212,0,126.0,96,144.3,80,302.8,102,7.6,3,1,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,71,0,207.0,112,173.8,96,178.4,61,12.1,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,93,32,116.9,120,232.4,97,127.7,112,11.0,9,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,48,0,275.2,67,180.2,108,159.0,110,7.9,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1,0,1,0 +0,79,32,50.6,62,201.4,87,146.8,121,4.2,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,173,0,172.5,78,142.6,91,102.0,63,10.9,2,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 
+0,118,0,154.8,71,244.0,73,159.6,81,12.8,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,0,0,1,0,1,0 +0,145,30,175.3,107,153.3,116,233.6,85,11.1,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1 +1,136,0,269.8,106,228.8,101,257.5,106,10.1,8,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,148,33,241.7,84,165.8,84,160.6,80,11.3,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,112,0,174.5,127,259.3,71,170.5,120,11.3,7,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,132,0,117.6,66,214.0,108,239.5,94,8.8,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,82,0,135.4,102,237.1,122,118.3,91,17.5,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,107,0,204.5,108,162.4,110,155.0,102,13.4,1,3,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,114,31,174.5,104,224.2,92,116.3,91,12.3,10,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,0,1,0,1 +1,37,0,239.9,120,261.6,88,207.1,88,8.9,4,2,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,107,32,134.2,101,211.9,145,167.6,138,8.2,5,1,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,52,32,130.1,68,247.2,77,289.4,87,13.5,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,52,21,195.7,119,106.2,95,157.4,94,5.3,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,1,0,0,1 +0,77,0,67.7,68,195.7,86,236.5,137,12.0,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,1,0,1,0 +0,177,0,232.8,106,175.2,97,212.2,77,12.5,7,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,118,0,113.0,80,150.1,87,204.3,115,10.8,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,1,0,1,0 +0,158,0,158.0,106,292.5,114,241.1,89,9.1,4,1,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,103,34,138.8,80,142.0,108,183.8,77,11.8,7,1,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +1,65,0,245.7,139,241.9,113,285.3,117,4.2,5,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,127,24,154.8,69,177.2,105,207.6,102,9.0,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,136,0,152.6,97,208.9,85,119.1,99,5.0,10,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,0,1,0,1,0 
+0,110,0,18.9,92,258.4,81,109.6,74,14.8,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +1,89,0,213.0,63,176.6,71,262.6,126,9.1,1,1,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,112,0,194.4,101,190.3,82,183.4,107,11.4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,154,35,64.9,76,184.1,91,151.6,75,14.6,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1,0,0,1 +0,36,0,253.4,77,182.4,151,275.8,103,8.4,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,183,8,86.5,119,285.2,97,180.4,133,8.7,2,2,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +1,113,20,157.8,83,161.5,56,271.5,100,8.7,2,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,137,0,110.5,79,223.2,111,169.5,64,10.5,3,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,61,0,45.0,108,151.3,74,152.9,94,9.8,6,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,124,0,151.0,98,120.6,119,152.8,81,9.2,2,2,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,87,0,322.5,106,204.6,93,186.2,128,9.4,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,144,33,251.6,87,197.6,118,209.2,97,12.2,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,157,0,168.6,71,205.1,48,175.8,88,5.9,2,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,1,0,1,0 +0,86,0,217.8,93,214.7,95,228.7,70,11.3,7,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,75,0,166.3,125,158.2,86,256.7,80,6.1,5,1,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +1,105,0,226.9,106,182.2,77,203.9,107,11.6,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,101,0,118.6,89,199.6,97,53.3,61,11.5,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,95,0,157.3,116,197.5,77,128.2,111,8.4,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,111,0,229.4,107,214.1,99,289.6,95,10.4,6,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,118,0,189.3,119,233.5,112,270.9,104,10.0,1,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,0,0,1,0,1,0 +0,136,0,92.0,117,253.6,77,214.1,90,10.3,10,1,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,67,0,179.8,125,173.2,86,272.8,97,10.9,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 
+0,137,0,144.0,90,181.6,100,128.1,93,12.3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,70,24,249.5,101,259.7,98,222.7,68,9.8,4,1,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,148,26,244.9,150,118.0,138,236.0,91,15.2,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0,0,1 +0,88,0,148.2,82,308.7,67,235.4,79,6.4,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,80,0,276.5,122,195.6,79,210.3,78,7.2,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +0,82,33,137.8,95,235.5,128,268.1,70,11.0,6,2,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,104,0,170.6,97,162.1,111,210.7,131,6.1,1,1,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,94,0,85.9,113,226.7,91,279.6,110,15.6,16,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0,1,0 +0,66,0,170.5,103,254.3,77,197.3,138,10.5,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,87,21,214.0,113,180.0,114,134.5,82,10.6,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +1,88,0,85.7,112,221.6,70,190.6,75,11.6,3,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,117,0,198.4,121,249.5,104,162.8,115,10.5,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,1,1,0 +0,80,0,199.8,138,167.1,91,271.8,94,5.5,4,1,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,35,0,158.6,67,130.4,96,229.8,80,6.9,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,115,0,184.1,98,327.0,73,212.5,106,7.5,6,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,65,34,208.8,119,142.1,106,214.6,87,12.5,4,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,95,0,134.4,104,152.4,95,236.5,80,9.4,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0,1,0 +0,153,0,185.3,127,208.0,73,206.1,124,15.1,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,73,0,157.6,92,198.3,87,364.9,106,9.1,4,1,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,89,0,82.3,77,167.2,80,194.7,70,7.2,4,1,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,149,0,175.4,80,197.4,127,188.2,102,9.7,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +1,100,0,70.8,94,215.6,102,230.8,125,9.5,1,6,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,0,1,0,1,0 
+0,172,0,172.5,85,253.1,71,221.6,113,5.9,6,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,67,0,152.5,131,252.4,107,185.4,104,4.9,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,1,0,1,0 +0,36,25,152.8,110,242.8,67,147.4,74,9.1,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,81,0,227.4,105,211.5,120,258.2,113,11.9,3,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,1,0 +0,79,34,103.7,100,236.3,78,256.6,102,14.8,4,2,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,139,0,211.1,103,206.9,108,193.9,70,5.6,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,117,22,196.0,82,322.7,82,225.6,120,3.7,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1 +0,102,0,158.0,94,207.9,100,190.4,120,10.1,10,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +1,88,0,235.1,98,251.8,79,285.9,76,7.2,2,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +0,176,0,201.9,101,154.7,78,164.4,79,9.0,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +1,105,0,281.3,124,301.5,96,202.8,109,8.7,3,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,160,0,256.0,111,187.4,61,119.1,81,11.5,4,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,48,0,188.4,63,165.9,89,205.7,71,13.2,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,1,0,1,0 +0,95,0,135.0,99,183.6,106,245.3,102,12.5,9,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,99,0,124.6,90,146.4,70,169.4,95,10.5,6,1,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,125,0,229.3,103,177.4,126,189.3,95,12.0,8,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,94,0,212.1,98,189.4,89,352.2,95,8.4,5,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,64,40,210.0,116,232.7,89,168.8,94,5.9,4,8,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +1,70,0,232.8,95,303.4,111,255.6,104,12.9,7,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,28,0,180.8,109,288.8,58,191.9,91,14.1,6,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,93,42,166.9,101,273.2,84,171.0,106,11.5,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1,0,0,1 +0,138,0,251.0,119,91.2,96,142.2,87,13.8,3,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,1,0,1,0 
+0,123,0,206.9,115,224.4,86,197.4,60,8.3,2,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +1,98,0,227.1,116,120.5,103,117.0,102,4.7,4,5,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,137,0,97.5,95,195.8,82,288.8,78,0.0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,1,0,1,0 +0,174,0,124.3,76,277.1,112,250.7,115,15.5,5,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,149,20,264.4,102,219.6,123,200.4,89,11.3,3,2,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,163,0,191.3,89,193.9,87,268.4,121,12.8,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,117,0,184.8,83,248.6,101,133.1,113,9.6,8,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,89,0,105.9,151,189.6,142,170.9,67,12.7,7,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,148,0,239.3,84,195.7,85,232.6,104,10.9,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,148,14,93.6,137,193.8,72,144.9,84,17.5,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,118,39,153.8,106,123.3,111,117.8,103,9.2,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,1 +0,17,0,161.5,123,214.2,81,315.0,106,8.6,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,35,37,181.2,76,177.6,98,228.0,136,5.0,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,98,0,236.2,122,189.4,110,153.6,104,13.3,4,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,98,0,206.5,92,176.2,152,232.8,115,12.4,5,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,40,0,220.8,100,265.7,106,212.8,94,6.4,3,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,1,0 +0,93,0,164.5,95,230.9,87,149.9,91,9.9,3,4,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,101,33,219.7,137,186.8,94,184.5,113,9.5,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,132,0,169.9,107,209.4,121,206.1,79,11.5,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,150,0,126.0,99,238.5,73,285.1,100,10.2,6,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,59,0,195.0,58,198.5,88,304.3,110,14.8,9,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 
+0,42,0,180.7,127,174.6,94,165.3,114,12.0,6,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,1,0 +0,86,0,223.9,75,155.7,109,150.2,143,7.3,9,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,75,0,117.3,114,201.1,61,107.9,82,12.2,3,1,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,107,0,133.8,85,180.5,94,112.2,115,8.9,4,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,109,0,153.1,102,234.1,77,329.2,74,9.9,9,3,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +1,65,0,277.9,123,155.8,112,256.9,71,9.2,10,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +0,113,0,158.9,137,242.8,109,247.8,97,6.5,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,69,0,135.4,101,238.1,124,195.6,102,10.6,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,60,0,207.8,109,123.5,112,291.6,115,5.7,9,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,59,0,107.8,113,216.6,125,217.5,92,9.9,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,133,0,162.1,91,212.1,94,260.4,78,12.2,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,85,0,183.4,111,168.8,98,199.7,97,9.9,4,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,150,0,189.3,77,220.9,105,238.7,117,9.2,5,4,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,125,0,126.7,108,206.0,90,247.8,114,13.3,7,1,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +1,98,0,217.2,121,303.4,73,197.1,71,12.4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,78,0,163.6,88,283.4,93,262.1,108,8.6,9,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,102,0,156.5,67,204.3,103,141.9,72,9.9,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,0,0,1,0,1,0 +0,97,25,141.0,101,212.0,85,175.2,138,4.9,2,3,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,82,0,143.7,116,170.7,99,287.7,95,7.8,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1,0,1,0 +0,1,0,144.8,107,112.5,66,218.7,79,13.8,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,173,0,109.4,103,101.3,111,167.3,106,7.8,7,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,1,0,1,0 +0,32,0,230.9,87,187.4,90,154.0,53,6.3,2,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 
+0,62,0,186.8,94,207.6,92,195.0,98,8.8,4,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,74,0,230.9,93,223.0,78,157.8,101,9.7,2,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,243,0,95.5,92,163.7,63,264.2,118,6.6,6,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,110,0,222.7,94,105.8,98,214.8,78,13.5,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,79,0,152.2,112,177.2,132,96.4,87,5.3,3,1,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,131,28,249.6,87,227.2,138,239.9,92,7.6,3,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,81,0,129.9,121,230.1,105,140.5,123,13.3,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,124,0,188.5,77,182.0,123,218.2,127,6.1,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,78,0,75.0,116,248.7,87,176.0,83,9.5,6,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,37,39,149.7,122,211.1,75,114.3,90,9.2,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,1 +0,107,30,198.9,87,207.0,90,159.8,76,12.6,4,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,73,0,240.3,130,162.5,83,231.9,136,11.9,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0,1,0 +0,68,0,222.1,107,199.4,102,162.4,107,9.4,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,148,36,77.6,141,207.0,60,255.7,115,10.9,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,67,0,167.8,91,167.7,69,110.3,71,8.4,12,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1,0,1,0 +0,53,37,167.3,99,194.7,99,236.7,112,12.0,9,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,142,0,187.0,133,134.6,74,242.2,127,7.4,5,2,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,83,25,191.3,95,250.7,136,249.4,86,17.6,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,117,0,201.9,86,212.3,96,176.9,98,7.8,10,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,137,0,147.2,119,192.8,91,172.7,105,10.2,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,115,0,146.7,128,106.2,74,197.7,104,11.1,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +1,71,0,178.2,113,167.8,94,182.1,111,13.6,3,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 
+0,146,0,206.3,151,148.6,89,167.2,91,6.1,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,114,32,125.2,79,177.8,105,232.4,89,12.9,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0,0,1 +0,107,0,103.4,94,189.3,125,227.2,125,14.4,3,1,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,116,0,201.8,82,231.5,95,226.1,130,16.5,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,126,23,114.3,102,190.3,103,240.4,111,12.6,7,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,85,21,283.2,110,239.7,108,149.5,80,6.3,1,5,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,43,0,168.4,125,243.8,89,214.7,102,11.1,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0,1,0 +0,20,0,186.8,89,253.4,51,273.1,105,12.3,6,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1,0,1,0 +0,133,0,176.8,92,187.5,97,196.8,88,6.5,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,1,0,1,0 +0,100,0,210.9,85,329.3,69,127.1,78,9.4,5,4,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,108,0,169.6,99,264.1,87,206.3,78,9.3,4,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,155,0,163.0,93,203.9,102,159.0,109,15.1,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,146,0,189.3,77,155.9,128,186.0,83,7.4,3,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,98,31,121.0,105,218.9,98,226.7,110,12.0,1,1,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,80,0,197.5,114,206.9,119,163.6,109,11.3,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,1,0,1,0 +0,106,0,158.6,112,220.0,114,252.9,106,9.1,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,78,0,252.9,93,178.4,112,263.9,105,9.5,7,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,116,0,51.1,106,208.6,137,198.0,92,12.3,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,141,22,215.4,123,328.7,98,160.5,89,7.8,6,1,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,77,0,239.2,114,150.0,115,160.8,81,10.3,2,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,143,0,172.3,97,174.0,108,188.2,119,13.0,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 
+0,72,0,147.0,79,162.3,103,162.9,80,10.5,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,87,22,240.8,102,75.9,106,224.6,115,7.1,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +1,126,0,239.7,87,281.7,92,183.5,113,11.4,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +0,80,0,198.1,160,156.7,87,182.1,76,9.3,3,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,72,0,138.9,111,211.6,102,179.5,91,10.8,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +1,115,0,139.3,89,192.3,95,151.0,75,9.3,3,7,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,43,0,177.2,93,142.6,60,314.1,144,12.7,2,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,100,0,185.0,122,182.5,92,274.9,92,5.1,8,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,139,0,221.3,140,157.8,89,192.5,89,11.3,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,77,0,189.5,112,207.0,95,214.1,91,9.2,7,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,88,0,144.3,116,156.4,74,214.7,90,7.8,10,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,55,0,119.7,148,231.8,96,222.3,113,4.6,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,95,32,262.2,123,165.2,82,194.3,57,10.6,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,143,0,160.4,120,285.9,104,182.5,85,6.9,4,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,1,0,1,0 +0,142,25,191.1,109,149.6,120,227.8,60,9.8,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,1 +0,179,38,220.1,78,234.3,71,237.3,85,10.1,4,4,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1 +0,154,32,192.3,82,165.3,134,205.0,74,9.0,1,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,118,0,133.4,113,121.0,92,254.7,129,5.9,4,1,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,142,0,84.8,95,136.7,63,250.5,148,14.2,6,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,193,17,124.0,102,202.9,81,205.1,129,12.3,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,0,0,1,0,0,1 +0,76,0,171.1,78,257.2,83,91.6,92,16.2,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,106,26,270.3,111,215.2,90,254.0,133,14.4,7,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1,0,1 
+0,116,35,200.4,104,272.8,89,214.5,100,8.3,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,1,0,0,1 +1,68,29,195.5,113,171.6,96,204.0,85,13.5,9,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1 +0,121,21,126.3,84,209.6,102,192.5,129,10.6,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,104,0,138.7,107,256.9,113,234.9,74,10.0,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,0,0,1,0,1,0 +0,157,0,224.5,111,200.7,99,116.6,118,11.5,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,102,0,195.7,116,209.1,87,201.1,73,8.3,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,70,0,230.3,110,77.9,87,247.1,105,13.2,4,1,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,82,29,163.8,77,134.9,112,79.3,95,8.8,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,67,0,260.4,107,208.2,104,207.9,115,10.0,2,6,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,0,0,1,0,1,0 +0,75,19,210.3,90,241.8,87,215.7,102,13.1,3,4,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,22,14,199.1,100,221.8,103,65.7,91,4.2,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,116,0,89.5,128,180.8,137,193.1,94,14.0,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,154,0,166.9,99,154.9,97,189.4,89,7.2,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,117,0,134.7,121,180.0,83,200.9,104,7.7,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,136,0,183.4,103,141.9,113,200.4,122,10.4,9,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,1,0 +0,24,0,156.2,104,90.0,101,205.1,116,7.3,5,1,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,107,31,160.3,45,221.5,70,261.6,109,5.6,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0,0,1 +0,120,27,153.5,84,194.0,73,256.5,94,10.2,7,5,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,114,31,222.8,98,180.5,105,151.3,101,13.0,4,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,127,0,261.7,105,181.8,107,100.9,131,3.3,5,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,121,44,116.0,85,150.1,120,246.8,98,12.0,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 
+0,103,29,164.1,111,219.1,96,220.3,108,12.3,9,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,55,39,139.3,101,178.3,117,246.5,104,8.1,1,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,66,16,174.7,92,232.1,105,305.4,98,8.9,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,130,0,183.0,112,72.9,99,181.8,78,9.5,19,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,132,33,200.3,75,226.6,67,198.8,91,12.9,3,2,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,1 +0,40,41,148.1,74,169.5,88,214.1,102,6.2,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,0,1,0,0,1 +0,84,0,191.0,88,318.8,119,247.3,79,6.5,4,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,115,0,206.2,113,176.4,102,297.1,119,11.0,7,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,177,0,84.9,77,257.5,109,210.5,66,7.5,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,51,26,236.8,61,263.4,97,181.1,91,11.2,8,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +1,103,18,149.9,84,170.9,84,171.5,112,11.5,7,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,144,18,106.4,109,108.1,113,208.4,111,10.1,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +1,21,0,92.6,95,161.9,70,285.0,78,11.3,3,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,135,28,201.4,100,246.5,117,154.8,131,12.9,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +1,111,0,172.8,58,183.1,108,158.8,104,7.9,3,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,103,0,255.9,128,140.9,92,308.9,130,12.1,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,1,0 +0,123,0,114.8,94,150.0,104,268.6,119,9.6,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +1,96,37,172.7,93,120.1,116,216.1,86,10.3,5,5,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,87,19,165.8,122,186.9,89,249.7,78,0.0,0,1,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,85,0,126.1,112,274.7,126,184.4,95,9.8,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,162,0,135.2,98,242.0,107,246.9,96,10.2,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 
+1,101,0,239.0,156,273.0,106,278.2,93,13.5,8,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +1,93,0,271.1,101,237.4,133,145.4,103,8.4,6,1,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,64,0,145.5,116,228.4,110,273.4,91,8.9,8,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,0,0,1,0,1,0 +0,110,0,241.2,105,174.3,85,245.3,59,8.5,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,0,1,0,1,0 +0,106,0,114.4,104,78.3,101,232.7,78,0.0,0,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,91,0,154.4,165,168.3,121,239.9,81,11.7,4,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,118,0,140.4,112,187.1,60,207.9,155,7.9,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,185,0,139.6,92,250.2,115,158.1,79,10.8,4,1,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +1,99,0,115.5,75,218.1,111,254.9,98,11.5,7,7,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,0,1,0,1,0 +0,80,0,105.8,110,43.9,88,189.6,87,13.1,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,59,0,155.2,79,235.3,123,169.4,80,8.7,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,41,0,223.8,67,244.8,74,223.8,156,12.3,5,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0,1,0 +0,156,0,174.3,95,186.6,128,258.2,105,12.9,5,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,101,28,105.9,132,231.7,107,281.3,120,10.7,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,78,0,193.4,99,116.9,88,243.3,109,9.3,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,184,12,200.3,76,253.6,105,149.3,93,10.2,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1,0,0,1 +0,68,0,195.4,116,212.1,101,138.4,134,15.1,11,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0,1,0 +0,137,0,243.4,114,121.2,110,162.6,104,12.2,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,159,0,169.8,114,197.7,105,193.7,82,11.6,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0,1,0 +0,142,0,145.4,93,209.1,98,214.0,96,10.9,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,114,0,189.8,101,147.7,80,172.7,121,10.6,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 
+0,101,0,0.0,0,192.1,119,168.8,95,7.2,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,1,0,1,0 +0,113,0,215.6,96,193.4,127,105.4,115,13.5,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,73,0,254.8,85,143.4,80,153.9,102,15.0,7,2,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,120,0,134.8,94,204.1,106,238.4,109,6.7,8,1,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,55,0,245.5,130,192.7,54,141.7,83,9.1,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,74,0,187.7,127,163.4,148,196.0,94,9.1,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,119,0,239.1,88,243.5,79,230.9,92,10.9,3,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,111,0,246.5,108,216.3,89,179.6,99,12.7,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,1,0,1,0 +0,77,0,144.9,136,151.3,115,252.4,73,12.3,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,1,0,1,0 +0,124,30,144.5,35,262.3,101,226.5,82,12.0,7,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,53,27,25.9,119,206.5,96,228.1,64,6.5,7,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0,0,1 +0,151,0,198.7,70,209.5,106,281.9,126,12.4,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +0,117,23,198.1,86,177.0,86,180.5,92,6.8,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,45,22,196.6,84,313.2,92,163.3,108,11.9,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,92,0,181.4,98,164.5,98,171.0,110,10.9,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,169,0,235.7,79,136.9,85,220.9,97,13.3,10,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,85,0,176.4,122,224.9,123,219.6,50,11.5,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,0,1,0,1,0 +0,36,0,202.4,115,230.7,115,202.0,127,10.2,2,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,64,0,213.5,93,166.6,114,122.0,78,14.1,3,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,113,0,155.0,93,330.6,106,189.4,123,13.5,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0,1,0 +0,144,38,105.0,86,121.8,123,221.5,122,3.7,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,1,0,0,1 
+1,161,0,322.3,100,230.4,135,241.5,104,7.8,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,99,0,200.0,66,107.9,104,233.7,82,11.4,2,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,103,37,153.5,78,241.9,108,244.7,110,10.6,3,1,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,65,0,148.7,80,259.0,94,149.5,107,12.7,6,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,121,28,110.0,94,141.5,76,237.3,87,6.4,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,1,0,0,1 +0,35,0,179.2,59,283.3,101,285.4,83,5.8,7,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,100,0,264.5,117,194.0,111,262.7,111,7.5,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,119,0,230.4,117,225.0,101,198.5,111,7.6,6,3,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,6,0,226.5,93,152.1,122,164.4,98,9.4,4,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,126,0,256.5,112,199.5,90,188.3,122,7.0,5,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,112,0,170.5,113,193.2,129,188.0,91,11.2,6,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,3,0,139.0,99,250.7,108,286.2,87,6.1,3,4,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,180,0,139.0,96,224.9,64,170.8,118,15.7,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,137,0,230.2,113,220.4,79,204.7,111,10.7,7,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,65,23,224.2,106,189.6,100,222.8,75,9.8,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,10,0,186.1,112,190.2,66,282.8,57,11.4,6,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,0,0,1,0,1,0 +0,113,0,90.6,130,170.6,100,137.4,74,5.4,9,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,126,30,153.4,90,151.4,97,153.8,97,12.8,4,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,166,0,152.1,95,121.0,105,198.0,126,9.8,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,116,0,205.0,90,140.9,114,272.6,96,7.5,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,172,0,212.0,121,31.2,115,293.3,78,12.6,10,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 
+0,114,25,129.0,77,290.0,110,177.1,110,11.6,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,36,0,235.1,97,196.8,104,259.7,110,7.0,7,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,53,0,145.1,116,233.7,82,208.7,95,7.9,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,166,0,136.1,116,181.4,93,131.4,108,11.3,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,132,0,222.4,85,165.4,76,208.4,97,11.2,4,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,58,0,234.8,89,106.8,131,178.5,122,9.9,6,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,141,37,258.6,84,222.0,111,326.4,97,11.2,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1,0,1 +0,70,0,198.6,111,213.9,115,171.2,105,10.6,6,2,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,108,0,210.7,112,238.7,73,253.6,90,9.2,5,3,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,9,31,193.8,130,202.6,98,191.2,102,13.3,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,148,0,148.2,138,159.6,123,197.4,62,8.6,3,2,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,1,0 +0,111,24,205.5,114,219.3,99,215.9,95,14.0,4,1,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,137,0,174.4,120,156.3,98,136.5,121,10.2,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1,0,1,0 +0,156,0,237.7,122,181.5,91,185.7,151,7.7,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,133,0,245.8,102,264.7,90,111.7,103,11.2,7,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +1,101,0,158.4,92,188.0,117,219.7,125,13.5,4,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,89,29,163.5,80,274.8,136,381.9,147,7.5,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,132,0,121.5,88,253.0,124,195.7,120,10.7,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,173,0,191.4,114,168.5,138,109.3,99,10.3,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0,1,0 +1,88,0,166.7,61,179.3,88,242.7,131,6.8,7,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,136,0,163.4,83,249.3,119,249.7,90,9.8,4,7,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 
+1,115,0,286.4,125,205.7,74,191.4,141,6.9,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,141,0,83.2,74,190.6,104,150.5,79,10.7,7,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,73,0,203.3,45,141.9,87,200.7,71,8.5,6,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,184,0,236.4,73,287.3,120,192.0,94,13.8,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,109,0,180.0,100,229.0,103,139.4,105,7.8,8,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0,1,0 +0,48,0,197.7,64,136.7,126,244.4,81,13.2,5,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,0,1,0,1,0 +0,74,0,124.8,114,133.0,121,160.3,85,10.6,7,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,135,0,201.8,81,225.0,114,204.4,82,10.3,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,71,0,104.0,92,197.0,125,110.1,123,14.6,8,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,21,19,132.7,94,204.6,101,154.7,78,12.9,7,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +1,181,0,161.3,83,124.4,83,262.0,98,14.1,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +0,67,34,161.7,114,207.6,115,205.7,114,9.2,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,68,0,148.8,70,246.5,164,129.8,103,12.1,3,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,1,0,1,0 +0,140,0,129.6,79,246.2,99,172.1,124,9.4,10,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,1,0,1,0 +0,61,0,188.9,105,153.6,116,213.3,106,10.2,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,139,0,138.1,103,164.5,100,134.9,63,8.3,2,1,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,97,32,168.4,129,225.9,97,191.8,95,8.5,7,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,98,0,158.4,71,306.6,66,144.2,93,2.1,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,1,0 +0,217,0,123.7,138,248.5,105,269.6,78,13.3,4,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,109,0,137.0,128,217.0,116,182.1,86,10.0,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0,1,0 +1,68,0,249.9,127,254.5,118,273.2,98,8.9,6,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 
+0,114,0,169.2,96,149.9,83,196.9,119,4.6,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,149,20,147.8,132,276.8,94,149.9,110,10.2,6,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,129,27,196.6,89,180.6,95,245.0,83,6.6,5,1,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,67,0,310.4,97,66.5,123,246.5,99,9.2,10,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,126,27,186.2,78,189.6,83,76.5,139,9.6,3,2,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,96,0,98.2,100,307.2,88,182.5,120,7.6,1,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,85,0,211.5,100,184.6,88,164.3,131,13.3,4,2,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +1,76,0,272.7,97,236.4,95,235.5,105,7.7,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,61,0,78.2,103,195.9,149,108.0,100,10.1,6,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,1,1,0 +0,110,0,196.1,103,199.7,123,135.9,71,12.9,1,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,71,0,141.2,132,149.1,90,171.4,72,7.0,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,0,1,0,1,0 +0,88,0,181.5,116,187.0,119,220.3,96,10.5,7,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,149,0,180.9,79,194.9,83,197.8,109,8.8,9,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,1,0 +0,164,0,160.6,111,163.2,126,187.1,112,9.0,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0,1,0 +0,129,0,159.1,100,202.5,90,233.1,96,11.5,6,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0,1,0 +0,68,0,237.1,105,223.5,105,97.4,79,13.2,2,1,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,139,23,157.6,129,247.0,96,259.2,112,13.7,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,1,0,0,1 +1,117,0,167.1,86,177.5,87,249.4,132,14.1,7,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,0,0,0,1,1,0 +0,120,40,128.1,99,247.7,78,199.7,121,15.6,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,70,0,208.7,97,275.5,83,182.5,122,8.0,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +1,143,0,167.8,72,211.0,99,153.5,109,10.5,6,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 
+0,78,0,168.3,110,221.2,73,241.0,136,12.5,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,147,0,248.6,83,148.9,85,172.5,109,8.0,4,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,95,0,237.3,83,154.0,65,237.0,105,11.2,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1,0,1,0 +0,48,0,210.8,84,189.6,98,157.6,99,16.4,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,25,0,178.8,90,141.2,72,203.0,99,8.4,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +1,166,0,47.7,89,264.4,95,235.2,97,13.2,3,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,1,0 +0,81,31,210.4,100,225.5,97,168.7,120,9.7,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,155,39,183.3,106,205.1,101,263.7,90,5.1,7,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,174,0,239.2,72,188.5,124,105.6,116,8.8,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,85,0,116.2,86,229.7,127,204.2,109,10.1,3,3,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,1,0 +1,169,0,266.7,105,158.2,88,287.7,111,13.8,3,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,116,0,110.9,54,213.4,82,186.2,116,7.9,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,81,0,154.5,84,216.2,91,229.8,82,13.7,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,105,0,115.5,73,267.3,83,114.2,90,13.3,5,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,181,0,143.3,91,195.5,58,223.3,95,6.0,7,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,89,35,174.4,108,196.7,100,127.4,74,11.8,3,1,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,52,0,214.7,68,158.6,138,123.4,114,9.4,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,101,0,153.8,89,234.0,89,196.3,77,11.6,2,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,0,1,0,1,0 +0,127,0,107.9,128,187.0,77,218.5,95,0.0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,121,44,254.1,127,180.2,108,196.2,129,8.7,4,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,0,0,1,0,1 +0,25,0,134.3,98,202.3,109,195.9,100,12.6,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,121,0,168.6,121,168.6,94,95.3,59,12.3,4,1,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 
+0,97,0,211.0,76,189.0,100,123.0,102,4.7,4,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,90,0,37.8,80,155.3,105,175.0,111,14.2,5,3,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,100,26,153.7,115,137.8,146,213.5,104,15.9,5,1,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,75,27,117.5,102,206.8,127,194.4,114,4.2,7,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +1,121,0,167.7,94,93.7,121,241.3,115,13.4,1,3,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,1,0 +0,88,0,152.9,119,171.2,107,257.0,106,12.0,5,2,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,127,19,129.7,115,160.8,101,265.0,63,12.2,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,7,0,206.7,87,281.1,83,158.5,77,11.0,5,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,101,0,136.2,92,220.9,110,196.9,116,13.3,7,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,0,1,0,1,0 +0,65,0,187.9,116,157.6,117,227.3,86,7.5,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,118,0,160.0,123,175.4,96,184.8,99,9.9,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,1,0 +0,100,0,131.1,108,176.2,81,89.7,81,4.3,4,1,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,81,0,145.6,59,287.9,131,181.7,121,9.2,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,92,38,242.2,96,159.7,144,210.0,108,8.9,1,1,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,76,0,204.0,69,225.1,110,240.3,85,9.6,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,141,0,192.4,111,156.9,87,175.8,82,11.0,6,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,146,32,154.0,80,185.5,91,148.2,107,8.2,4,3,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,190,0,142.9,96,177.9,96,113.3,117,6.6,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,90,0,125.4,158,269.1,83,238.6,103,11.0,7,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0,1,0 +0,86,23,225.5,107,246.3,105,245.7,81,9.8,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,137,0,141.1,91,147.2,100,254.7,75,8.0,7,2,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 
+0,128,0,187.3,84,270.8,95,206.4,68,10.1,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,79,0,147.0,72,165.7,102,243.2,107,8.4,9,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,177,0,175.7,120,168.6,90,198.9,110,14.6,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0,1,0 +0,45,0,142.4,107,318.7,78,224.1,108,11.1,7,1,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,10,0,183.0,103,214.8,77,206.4,73,8.7,6,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,100,25,215.9,90,257.9,92,180.2,157,11.6,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,0,1,0,0,1 +0,86,38,123.0,158,133.9,119,138.2,103,13.3,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0,0,1 +1,126,0,249.8,96,261.9,92,166.8,108,12.7,4,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,99,0,142.3,89,204.5,95,203.1,114,9.1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,25,0,119.3,87,211.5,101,268.9,86,10.5,4,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,104,0,278.4,106,81.0,113,163.2,137,9.8,5,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,62,0,128.7,111,169.5,104,193.6,97,10.3,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1,0,1,0 +0,70,0,177.4,125,226.2,104,254.1,72,10.9,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,37,0,206.0,89,186.0,88,307.1,86,8.4,11,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,128,0,247.3,91,182.7,60,143.2,112,14.7,2,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,125,0,224.9,102,143.8,87,198.9,105,8.0,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,192,36,156.2,77,215.5,126,279.1,83,9.9,6,2,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,99,0,145.6,102,230.9,87,181.5,86,11.4,7,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,58,0,174.4,112,265.8,122,182.4,87,0.0,0,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,125,0,163.8,73,255.6,85,192.9,95,15.7,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,127,0,256.5,87,222.1,101,156.7,122,13.0,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 
+0,166,35,128.2,138,274.5,113,298.9,130,8.8,7,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,0,0,1,0,1 +0,130,0,176.3,140,201.0,104,161.9,123,11.3,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,135,0,186.0,107,66.0,94,213.1,105,12.9,4,1,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +0,145,31,216.0,94,225.1,123,234.7,109,10.7,1,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1,0,0,1 +0,82,19,146.5,73,246.4,65,199.0,114,4.1,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +1,107,0,222.3,101,286.0,111,249.4,117,12.1,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,116,23,253.0,78,138.9,121,277.8,104,11.8,3,2,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,151,26,196.5,98,175.8,111,221.8,124,13.4,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,131,0,109.5,95,332.1,48,258.6,108,6.6,7,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,82,0,143.9,61,194.9,105,109.6,94,11.1,2,1,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,141,32,148.6,91,131.1,97,219.4,142,10.1,1,1,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,158,0,220.9,129,242.2,108,233.3,75,6.4,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,107,0,260.5,108,102.4,110,129.7,148,9.8,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,119,0,190.4,74,215.6,113,161.2,111,10.0,1,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,1,0,1,0 +0,134,0,208.3,86,253.6,89,291.0,86,12.6,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,0,1,0,1,0 +0,39,36,141.7,121,232.3,113,222.1,131,12.0,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,85,0,216.0,73,188.2,117,147.1,98,3.6,7,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,1,0 +0,60,0,145.0,133,209.1,92,328.5,112,14.6,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,119,0,287.1,115,159.3,99,216.8,86,13.9,1,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,63,0,164.5,75,147.9,118,252.7,97,11.2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,69,37,155.0,98,142.4,105,143.7,117,5.9,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 
+0,71,0,277.5,104,131.8,121,126.9,101,8.2,2,1,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,72,0,207.8,92,195.7,110,184.8,124,13.1,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,112,0,217.1,76,205.2,100,185.7,91,9.4,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,142,38,163.3,104,136.0,114,249.1,127,4.3,6,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,74,0,124.0,102,262.1,101,268.2,98,11.7,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,1,0,1,0 +0,21,31,135.9,90,271.0,84,179.1,89,9.5,7,6,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,165,33,111.6,140,213.3,111,267.6,115,16.0,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,73,0,254.7,80,90.2,79,153.4,60,10.6,8,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,176,0,169.5,151,112.9,84,56.6,99,8.7,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,41,0,182.1,89,211.5,104,207.4,124,6.8,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,91,0,133.8,61,158.8,96,189.6,92,10.5,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,157,28,192.6,107,195.5,74,109.7,139,6.8,5,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,106,33,81.6,120,235.6,85,150.9,113,9.9,4,1,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,83,0,259.7,106,152.7,116,224.7,92,10.2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,47,27,165.0,89,127.3,118,284.4,95,7.7,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,0,1,0,0,1 +0,60,0,203.2,99,235.8,131,224.9,112,15.1,6,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,197,0,154.8,111,171.5,102,227.3,86,10.6,2,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,45,0,159.8,91,120.4,86,163.0,93,10.6,3,2,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +1,105,29,220.7,82,217.7,110,190.5,100,13.2,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1 +1,75,0,211.3,61,105.6,119,175.9,63,9.7,4,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,1,0 +0,98,23,245.5,54,292.7,83,184.0,90,10.8,7,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1 +0,59,28,120.9,97,213.0,92,163.1,116,8.5,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 
+0,113,34,44.9,63,134.2,82,168.4,118,13.3,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,0,0,1,0,0,1 +1,163,0,247.7,77,269.5,108,167.3,82,9.6,7,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,113,0,186.0,55,237.4,105,148.1,83,12.2,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,138,33,155.2,139,268.3,79,186.4,71,9.7,4,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,99,0,198.2,87,207.3,76,190.9,113,8.7,3,4,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,40,0,109.4,107,244.7,102,276.9,123,7.1,7,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,130,26,257.2,108,224.3,122,204.0,118,12.6,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,90,0,205.7,138,161.9,83,269.7,104,12.5,6,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0,1,0 +1,161,0,154.7,84,177.8,125,172.9,90,5.9,2,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1,0,1,0 +0,60,0,98.2,88,180.5,69,223.6,69,9.3,2,2,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,55,0,194.1,121,176.6,110,302.8,136,7.0,7,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,0,1,0,1,0 +0,87,0,198.3,80,187.0,89,133.5,96,16.6,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,60,0,183.0,110,206.7,93,203.8,119,11.1,6,1,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +0,160,0,166.8,109,236.0,117,307.6,77,9.3,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +1,24,0,243.0,91,183.9,77,184.3,109,15.3,6,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +1,44,0,228.1,121,276.5,79,279.8,77,9.9,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,1,0,1,0 +0,66,0,34.0,133,278.6,61,129.6,120,11.5,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,98,29,179.9,97,189.2,89,164.3,76,12.8,7,3,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,76,0,198.3,130,217.1,86,188.4,96,12.5,3,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +1,193,0,96.8,92,142.6,103,210.1,115,10.9,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,62,0,321.1,105,265.5,122,180.5,72,11.5,2,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,95,0,229.9,116,202.4,110,171.4,105,14.2,6,1,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 
+0,47,37,163.5,77,203.1,102,232.0,87,7.8,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,94,0,137.5,118,203.2,88,150.0,131,13.4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,177,0,266.1,91,225.2,79,224.7,58,8.9,8,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,0,1,0,1,0 +0,92,23,167.4,83,258.6,129,116.4,110,11.2,3,4,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,111,0,191.3,80,138.5,94,246.0,107,6.4,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,147,0,130.6,83,208.1,144,204.6,72,15.6,3,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,58,0,238.9,107,187.2,88,181.1,84,11.8,3,2,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,110,0,149.8,112,180.0,93,140.0,119,11.7,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,145,0,241.7,137,135.8,100,277.6,123,13.1,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,93,19,136.8,113,179.5,105,71.1,95,12.5,3,2,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +1,95,0,228.9,134,255.7,71,208.0,120,10.1,2,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,1,0 +0,156,0,150.5,106,152.9,112,215.9,86,3.5,3,1,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,75,24,225.5,119,182.0,108,270.9,106,9.4,2,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,67,41,174.7,86,160.6,93,155.3,108,13.4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +1,172,0,270.0,102,256.6,111,168.5,104,12.0,5,2,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,95,22,40.9,126,133.4,90,264.2,91,11.9,7,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,146,0,195.9,86,228.6,82,303.5,94,12.2,4,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,210,0,104.6,121,149.5,71,255.1,67,6.5,8,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,0,0,1,0,1,0 +0,148,0,218.9,88,208.0,85,203.3,99,11.1,4,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,1,0 +1,150,0,178.9,101,169.1,110,148.6,100,13.8,3,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,37,0,134.9,98,248.4,130,236.2,113,14.7,2,3,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 
+0,126,0,228.7,102,168.7,99,223.5,100,11.8,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,97,0,276.1,82,201.1,106,231.3,73,8.9,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,0,1,0,1,0 +1,76,0,129.7,84,177.5,80,228.9,87,7.5,3,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,138,37,75.8,102,173.6,147,162.6,96,8.2,13,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,129,0,186.0,127,262.3,96,98.9,63,11.5,6,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,0,0,1,0,1,0 +0,122,0,168.3,96,87.6,91,247.2,87,8.4,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,96,0,170.5,86,277.5,88,162.5,117,12.2,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +1,68,24,125.7,92,275.9,98,214.5,108,14.2,6,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1 +0,185,0,55.6,97,288.7,83,111.2,110,12.1,3,2,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,87,0,238.0,97,164.5,97,282.5,132,10.6,6,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,1,0,1,0 +0,83,36,95.9,87,261.6,105,228.6,109,13.3,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,111,0,214.3,118,208.5,76,182.4,98,12.0,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,1,0,1,0 +0,85,0,209.8,82,194.5,94,200.4,85,11.3,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,120,0,192.6,123,206.4,105,283.2,93,10.8,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,1,0,1,0 +0,88,0,138.3,116,236.0,138,179.1,110,9.6,4,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,40,0,170.7,55,179.1,108,281.9,89,8.2,9,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,1,0 +0,153,0,166.8,127,143.5,121,210.7,130,11.8,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,50,0,154.7,102,298.0,108,210.2,95,11.1,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,61,0,197.7,118,152.2,96,221.0,93,7.0,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,80,0,161.1,99,198.8,81,228.4,116,10.6,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,205,0,49.9,123,150.7,81,188.2,67,10.1,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,99,0,222.4,102,185.8,89,237.7,81,12.0,8,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 
+0,107,0,194.4,83,247.8,84,245.4,93,11.2,7,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,1,0 +0,112,0,115.8,108,243.3,111,184.6,78,13.1,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,51,0,51.8,107,230.2,104,227.5,118,10.4,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,90,29,150.1,109,264.7,103,178.4,97,5.8,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,1,0,196.1,107,296.5,82,211.5,91,7.0,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,155,0,203.4,100,190.9,104,196.0,119,8.9,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,128,0,158.8,75,264.8,91,270.0,77,7.6,7,1,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,78,32,210.3,116,192.2,83,246.1,92,10.8,4,6,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,63,0,185.3,87,225.3,87,194.3,93,11.7,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,58,0,131.9,96,167.6,107,205.9,106,14.7,5,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +1,113,0,61.2,111,92.3,88,197.4,114,13.7,3,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0,1,0 +0,83,0,178.8,102,167.9,84,178.9,65,8.6,4,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,123,33,146.6,87,114.8,59,220.4,99,2.9,7,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,100,0,68.5,110,337.1,115,205.2,99,12.1,9,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,1,0 +0,90,29,185.6,106,219.7,113,152.1,120,11.1,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +1,80,0,189.1,122,223.2,92,269.0,116,13.9,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +1,132,0,193.3,106,128.3,94,162.1,119,11.6,4,5,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,79,0,213.6,110,234.9,121,229.6,157,8.8,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,103,0,158.7,90,198.4,117,181.1,76,10.5,4,1,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,141,32,322.4,92,283.2,107,209.5,111,6.7,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,128,0,268.1,95,120.5,126,220.8,121,14.4,6,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 
+0,109,0,218.9,105,299.9,87,158.6,110,11.3,4,2,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,73,0,214.3,145,268.5,135,241.2,92,10.8,13,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,117,21,153.2,112,263.3,110,135.0,85,11.9,7,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,84,0,217.1,99,236.0,68,118.3,120,9.4,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,162,0,49.2,121,143.9,136,203.0,97,12.1,13,1,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,81,0,166.2,102,217.6,112,220.2,68,13.2,2,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,63,0,128.7,78,240.8,133,237.7,121,12.8,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,69,0,185.3,91,219.1,88,243.6,107,5.5,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +1,80,0,268.7,120,301.0,147,167.0,140,5.8,1,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,79,0,261.7,97,210.6,48,256.7,83,6.0,3,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,78,25,197.4,73,295.7,113,211.7,73,13.2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,79,21,264.3,79,202.8,118,173.4,92,6.3,3,4,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +1,144,48,189.8,96,123.4,67,214.2,106,6.5,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,105,29,179.4,113,275.4,100,246.1,105,10.0,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,73,0,160.1,110,213.3,72,174.1,72,13.0,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,145,43,257.7,97,162.1,95,286.9,86,11.1,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,59,0,160.9,95,251.2,65,273.4,97,5.0,5,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,136,0,142.3,79,158.0,113,177.5,75,6.0,11,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,0,0,1,0,1,0 +0,74,0,221.1,124,110.8,94,240.1,112,10.6,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,94,0,177.1,112,194.0,112,146.7,108,5.9,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,1,0,1,0 +0,70,0,104.7,112,82.2,104,169.4,110,15.8,7,3,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 
+0,129,37,205.0,94,165.4,103,185.0,81,11.7,8,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,1,0,0,1 +0,69,31,194.9,63,191.6,90,153.0,129,13.2,2,2,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,86,0,176.3,79,259.2,97,287.4,78,6.2,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,135,27,273.4,141,154.0,99,245.8,112,12.3,6,1,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,91,0,123.8,107,319.0,125,237.6,78,7.3,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,93,0,266.0,120,130.1,84,165.8,63,13.1,6,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,84,0,280.0,113,202.2,90,156.8,103,10.4,4,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,115,29,222.6,81,190.3,109,201.2,87,11.5,2,1,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,84,0,159.5,125,247.1,90,187.9,82,7.2,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,84,0,169.5,96,157.6,94,98.2,70,10.6,7,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,161,0,221.7,95,193.0,82,194.1,113,6.5,4,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,155,0,181.4,111,167.7,92,168.5,122,11.3,3,3,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,109,26,217.2,138,145.5,111,280.7,76,9.3,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1 +1,77,44,103.2,117,236.3,86,203.5,101,11.9,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,1 +0,72,0,272.4,88,107.9,125,185.5,81,12.7,2,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,19,0,237.7,98,207.1,121,182.2,95,4.5,4,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +0,138,26,183.9,83,240.7,93,185.7,125,15.0,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,132,0,181.1,121,314.4,109,246.7,81,4.2,9,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,61,16,143.5,76,242.6,58,147.7,95,11.3,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1 +0,120,24,227.5,81,234.9,71,166.4,128,9.0,13,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0,0,1 +0,204,0,174.3,85,254.1,95,176.4,96,5.9,3,6,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,83,20,95.0,89,167.9,92,200.6,79,11.2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 
+0,108,0,193.3,126,154.7,85,174.8,98,9.4,6,3,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,119,0,159.1,114,231.3,117,143.2,91,8.8,3,5,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,103,0,204.9,107,135.2,102,208.2,106,10.4,3,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,0,1,0,1,0 +0,70,0,7.9,100,136.4,83,156.6,89,12.1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,1,0,1,0 +0,100,0,219.4,112,225.7,102,255.3,95,12.0,4,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,0,0,0,1,1,0 +0,123,27,218.7,79,163.4,78,173.8,116,15.0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +1,173,0,291.8,143,214.3,134,151.2,119,9.9,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,153,28,235.6,74,227.9,37,170.3,103,15.4,9,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1,0,0,1 +0,76,0,204.2,100,292.6,139,244.3,105,10.5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,83,41,155.9,122,162.3,107,127.6,105,13.1,5,3,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,115,14,192.3,86,88.7,90,229.4,120,10.5,3,2,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,20,0,190.0,109,258.2,84,181.5,102,6.3,6,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,34,14,151.5,100,248.7,126,199.8,120,10.7,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,60,0,221.1,106,178.6,48,202.7,90,7.4,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,184,0,213.8,105,159.6,84,139.2,137,5.0,10,2,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,1,0 +0,174,0,192.1,97,169.9,94,166.6,54,11.4,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,88,27,93.4,106,252.0,92,189.0,104,10.9,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,225,0,182.7,142,246.5,63,218.0,103,8.8,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0,1,0 +0,73,0,213.0,95,188.8,104,136.2,89,13.5,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,0,0,1,0,1,0 +0,93,0,179.5,121,191.9,131,165.5,125,12.0,4,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,88,0,142.2,107,262.4,84,139.2,99,10.1,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 
+0,128,0,216.0,111,153.7,115,227.0,74,12.7,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,1,0,1,0 +1,122,0,230.9,132,243.2,99,182.4,57,11.0,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +0,41,0,237.8,92,223.5,155,217.4,90,10.2,6,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,79,17,167.9,114,243.7,93,211.9,114,9.1,2,1,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +1,60,0,205.9,97,277.4,117,202.0,139,11.0,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,0,0,1,1,0 +0,65,0,158.8,53,188.5,132,189.3,87,9.8,4,2,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,76,0,186.1,96,211.6,100,230.6,100,8.0,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +1,40,0,169.7,115,141.4,123,253.0,115,10.5,3,4,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,67,28,95.0,94,291.2,73,159.6,114,10.0,2,2,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,160,0,82.7,116,194.6,95,159.0,54,10.9,9,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1,0,1,0 +0,82,0,154.0,107,94.4,114,287.6,95,10.1,7,1,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,102,25,137.4,100,176.7,83,188.2,93,10.2,6,2,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +1,97,0,236.9,107,157.6,105,241.0,120,7.3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,1,0 +0,91,39,169.8,105,65.2,116,144.4,92,10.9,4,1,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,102,0,129.5,56,354.2,118,145.5,93,10.9,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,120,0,158.0,110,197.0,103,154.9,132,10.0,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,190,0,111.9,55,223.0,124,243.2,81,10.0,7,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +0,98,0,136.1,82,156.3,118,158.8,83,10.1,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,150,35,139.6,72,332.8,170,213.8,105,8.8,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,141,0,126.9,98,180.0,62,140.8,128,8.0,2,1,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,41,37,239.8,110,221.9,115,189.1,100,7.3,1,2,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 
+0,98,31,181.6,112,220.7,100,236.3,121,12.9,4,2,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,130,0,154.2,119,110.2,98,227.4,117,9.2,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +1,111,0,123.1,88,213.9,84,184.9,88,12.0,2,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,105,0,228.4,100,145.1,108,245.3,140,7.7,7,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +1,95,0,165.5,84,286.2,112,198.9,89,11.5,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +0,131,0,196.1,89,185.5,87,250.0,132,5.2,6,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0,1,0 +0,194,0,162.3,88,213.7,118,192.1,81,10.9,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,114,0,187.8,109,154.6,97,213.9,102,10.1,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,13,0,193.2,89,194.4,90,186.5,104,9.7,2,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,113,0,272.1,111,268.5,118,213.8,105,8.5,10,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,0,1,0,1,0 +0,123,0,154.3,107,183.0,111,54.0,134,10.9,8,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,0,0,1,0,1,0 +0,92,44,152.0,95,274.9,73,162.4,121,10.0,1,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,98,0,126.3,102,166.8,85,187.8,135,9.4,2,3,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,155,0,262.4,55,194.6,113,146.5,85,8.3,6,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,112,16,221.6,110,130.2,123,200.0,108,11.3,3,1,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,125,0,212.3,89,215.4,127,186.8,73,11.3,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,68,39,142.0,140,241.6,89,302.0,72,11.3,5,1,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1 +0,87,0,171.6,119,205.0,107,170.6,114,13.8,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,1,0,1,0 +0,61,20,254.4,133,161.7,96,251.4,91,10.5,4,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,81,33,161.6,117,123.0,90,261.3,101,12.2,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,1,0,0,1 +0,142,0,232.1,102,168.2,110,197.3,120,9.9,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,1,0,1,0 
+0,159,23,153.6,93,216.9,88,161.3,91,12.6,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,161,0,189.6,78,267.4,117,184.5,137,1.3,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1,1,0 +0,104,0,160.4,73,293.9,103,306.6,90,12.6,5,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,1,0 +0,55,0,269.6,121,171.7,91,219.0,98,8.2,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +0,117,0,102.3,100,135.2,104,199.7,93,15.7,10,3,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,139,25,96.2,112,178.9,70,182.1,84,12.9,10,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,59,0,179.4,80,232.5,99,175.8,105,14.7,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,113,23,149.0,104,235.8,67,201.8,76,9.5,5,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,1,0,0,1 +0,100,0,191.9,95,200.9,101,271.9,74,18.2,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,77,0,62.4,89,169.9,121,209.6,64,5.7,6,5,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,107,0,241.9,102,126.9,117,185.6,92,11.7,6,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,125,0,203.4,110,128.7,97,190.5,113,11.0,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,89,0,115.4,99,209.9,115,280.9,112,15.9,6,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0,1,0 +0,97,0,143.7,117,273.0,82,178.3,81,10.9,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +0,147,0,209.4,104,132.5,78,149.4,123,11.3,3,2,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,93,0,152.4,74,274.6,88,252.2,120,6.6,5,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,62,0,245.3,91,122.9,130,228.4,102,8.5,4,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,104,0,225.9,123,162.8,106,272.1,85,10.1,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0,1,0 +0,165,0,242.9,126,209.8,65,228.4,126,0.0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,1,0,1,0 +0,93,0,271.6,71,229.4,108,77.3,121,10.9,3,2,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,39,0,160.4,68,102.6,103,235.3,106,9.1,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 
+0,86,0,226.3,88,223.0,107,255.6,92,13.0,3,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,36,0,157.6,117,184.3,58,240.4,99,11.9,1,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,147,38,243.4,126,273.8,109,282.9,91,14.1,8,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +1,85,0,165.8,96,190.0,141,144.0,116,10.9,3,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,138,21,19.5,149,140.9,109,179.7,111,7.9,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +1,55,0,285.7,124,230.9,106,230.7,140,14.8,7,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,0,1,0,1,0 +0,28,0,159.7,79,216.7,131,206.7,116,9.3,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,123,0,166.9,98,221.8,77,243.9,114,12.8,4,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,91,23,232.4,97,186.0,88,190.5,128,12.3,3,3,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +1,66,0,154.0,133,198.9,121,151.9,100,9.5,3,4,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,31,0,177.3,129,152.8,105,162.9,92,5.1,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,75,0,222.4,78,327.0,111,208.0,104,8.7,9,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +0,104,18,182.1,66,213.6,65,193.0,108,13.4,9,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,103,0,190.9,62,226.6,53,230.1,96,7.8,3,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,58,29,240.4,80,118.9,91,164.2,108,11.2,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +1,84,0,225.9,86,275.6,105,201.4,108,14.3,3,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,98,0,216.8,86,190.8,114,187.5,79,11.0,9,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,76,22,160.1,107,168.7,136,23.2,102,9.5,4,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,118,0,253.2,122,201.0,78,195.3,108,9.7,7,2,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,48,37,211.7,115,159.9,84,144.1,80,12.2,1,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,132,0,190.1,105,182.2,116,279.8,105,13.0,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 
+0,79,0,156.9,109,122.2,87,189.1,103,11.3,5,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,85,0,225.0,81,176.9,63,194.3,110,7.1,2,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1,0,1,0 +1,68,0,148.5,126,219.4,125,198.5,121,14.5,7,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,0,0,0,1,1,0 +0,57,29,279.9,121,223.1,109,251.7,94,13.0,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +1,68,0,213.9,112,260.5,100,233.8,97,8.4,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,107,0,133.3,106,182.9,89,241.1,123,12.9,2,3,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +1,114,0,147.1,119,161.0,111,275.9,106,9.0,3,5,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,158,0,222.8,101,203.0,128,210.6,106,6.9,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,1,0,1,0 +1,98,0,0.0,0,159.6,130,167.1,88,6.8,1,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,174,0,235.5,108,142.3,143,316.7,131,12.5,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,132,31,174.5,101,245.6,105,172.8,76,10.3,9,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,76,0,203.6,61,161.7,127,175.9,97,8.4,3,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +1,117,0,287.4,118,259.6,84,153.2,86,10.0,3,1,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +0,67,0,115.5,70,252.2,143,208.9,91,7.5,6,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,120,29,244.3,140,322.3,89,166.8,83,10.6,6,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,1,0,0,1 +0,122,30,230.1,108,287.6,76,177.1,85,6.9,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +1,112,0,335.5,77,212.5,109,265.0,132,12.7,8,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,86,0,136.4,104,202.5,110,230.7,86,11.5,1,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0,1,0 +0,88,0,202.2,86,216.8,93,239.4,99,11.8,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,110,0,148.4,95,193.8,98,206.0,106,6.9,6,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,62,33,186.4,84,201.0,136,286.7,103,11.1,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 
+0,151,17,214.7,97,138.5,90,169.1,44,8.6,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +1,113,0,239.7,47,282.9,110,238.4,88,8.7,3,2,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,108,0,293.0,88,160.6,101,143.9,87,10.0,6,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,101,0,193.7,108,186.6,98,223.0,100,11.6,8,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +0,132,15,154.6,128,245.6,106,148.6,90,9.1,4,1,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,66,40,141.7,87,268.3,89,241.3,68,8.5,7,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,115,0,195.9,111,227.0,108,313.2,113,13.2,1,2,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,38,25,142.4,106,313.7,109,126.6,117,13.4,6,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,89,0,137.9,96,192.6,63,255.7,125,11.0,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,1,0,1,0 +0,99,0,62.9,81,231.0,64,168.9,121,8.5,5,1,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,66,32,187.8,117,129.8,90,132.3,113,12.0,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,60,0,180.3,67,208.0,68,181.2,101,12.8,6,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,157,30,194.3,107,243.2,108,322.2,114,7.1,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,106,31,197.4,125,123.4,110,115.6,101,12.3,4,3,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,16,0,174.7,83,280.8,122,171.7,80,10.5,8,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0,1,0 +0,60,0,193.9,118,85.0,110,210.1,134,13.2,8,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,0,1,0,1,0 +0,89,0,134.9,59,156.0,152,197.5,112,10.2,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,13,0,58.4,121,262.2,64,159.0,115,11.9,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,97,0,168.6,87,259.2,105,279.8,123,7.3,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,73,0,286.4,109,178.2,67,214.2,152,10.7,14,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,148,0,208.4,120,174.4,99,310.7,105,11.2,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 
+0,177,27,230.2,106,196.1,78,215.4,108,10.4,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,123,27,198.7,127,249.0,105,173.2,124,12.5,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,62,42,137.3,95,184.2,94,231.4,70,10.2,3,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,78,0,149.7,119,182.2,115,261.5,126,9.7,8,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,94,0,206.1,49,224.6,115,256.7,74,13.0,1,1,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,79,0,157.6,85,194.1,92,231.5,86,9.4,10,5,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,131,39,69.1,122,101.3,136,104.8,94,9.1,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,1,0,0,1 +0,185,30,154.1,114,118.7,106,258.4,105,12.9,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1,0,1 +0,81,36,115.9,120,236.6,95,255.0,90,11.7,6,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,77,24,149.4,74,123.9,72,174.3,84,10.1,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,141,28,206.9,126,264.4,126,171.8,124,9.3,11,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,116,27,175.5,137,210.6,60,294.8,121,6.9,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1 +0,172,0,169.8,123,183.1,94,395.0,72,12.7,7,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,1,0,1,0 +0,136,21,179.4,88,181.1,97,320.7,120,9.5,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,124,0,234.4,61,179.3,111,285.5,117,10.4,6,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,96,45,248.8,124,140.3,77,263.6,102,10.3,2,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,155,23,243.9,112,133.0,106,213.7,123,13.4,11,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,168,0,183.2,131,179.2,73,292.8,100,9.9,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,16,0,153.2,65,229.7,90,148.2,94,10.7,8,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,148,0,243.0,115,191.8,91,117.8,93,13.4,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1,0,1,0 +0,122,0,144.2,87,212.2,74,169.3,87,9.5,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 
+0,96,29,150.0,91,159.4,75,228.1,55,8.5,3,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,50,22,252.9,112,177.9,99,158.4,146,8.5,4,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,64,0,194.2,147,173.4,87,268.7,114,5.5,2,2,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,109,35,230.5,116,265.8,130,269.7,69,10.6,6,5,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,90,30,183.8,76,229.7,95,144.1,124,7.7,3,1,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +1,69,33,271.5,98,253.4,102,165.4,85,8.2,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1 +1,95,0,269.0,120,233.7,120,179.3,61,7.3,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,37,0,271.7,112,155.1,96,199.5,97,6.6,4,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,87,0,177.2,72,248.9,105,200.8,87,8.6,7,3,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,98,0,171.7,99,174.8,87,189.6,130,7.8,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,75,0,150.6,99,301.5,83,158.7,104,8.1,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,11,38,209.8,130,196.6,84,233.0,79,7.0,7,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,1,0,0,1 +0,79,0,236.8,135,186.4,87,126.9,112,10.4,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,54,24,92.3,88,193.1,98,99.3,119,11.6,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,117,0,191.1,93,282.8,56,84.8,118,12.0,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,0,1,0,1,0 +1,149,0,119.2,88,168.3,110,204.7,119,12.2,6,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,110,40,202.6,103,118.8,128,234.9,98,9.0,9,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,65,0,207.7,109,217.5,117,125.6,111,8.0,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +1,82,0,300.3,109,181.0,100,270.1,73,11.7,4,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,101,0,220.3,124,188.6,101,278.4,98,10.6,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,120,27,128.5,115,163.7,91,242.9,121,0.0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,182,0,104.9,111,198.5,120,258.2,91,8.0,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 
+0,80,0,151.5,89,131.7,78,235.3,131,11.8,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0,1,0 +0,117,17,221.3,82,167.6,100,262.7,87,4.4,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +1,78,0,236.8,141,265.3,101,152.4,77,9.5,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +0,78,0,208.9,119,252.4,132,280.2,120,12.8,7,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,129,0,157.0,113,256.9,97,185.5,126,12.1,2,2,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,146,0,204.4,135,219.1,90,222.7,114,10.5,6,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,1,1,0 +0,70,0,147.1,105,200.0,135,234.9,65,12.5,9,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,54,0,210.5,102,204.5,83,127.8,53,8.5,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,88,45,80.3,140,153.3,101,309.2,123,12.8,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,92,0,157.7,101,298.6,100,216.9,99,13.8,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,1,0,1,0 +0,149,0,166.6,61,218.8,107,208.3,131,8.2,6,7,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,122,27,253.7,84,229.2,109,190.5,123,9.2,5,7,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,145,24,147.5,90,175.7,108,252.1,102,15.6,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,117,0,119.0,82,187.5,108,189.3,97,11.5,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,133,0,216.2,67,222.2,133,192.0,95,3.1,1,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,111,0,284.4,89,157.0,113,242.8,91,8.4,8,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,160,0,234.9,136,270.8,134,219.3,101,13.9,2,1,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,123,0,159.5,77,303.8,92,226.9,120,12.0,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,127,25,82.2,95,163.3,109,264.9,104,5.1,6,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,138,0,268.4,81,174.4,115,193.5,96,11.6,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,1,0 +0,65,0,195.4,110,181.2,109,178.5,105,8.9,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 
+0,80,0,127.8,67,181.6,112,197.3,63,15.9,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,1,26,208.0,115,185.0,113,177.7,144,8.1,9,1,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,87,0,228.7,90,163.0,99,154.1,90,11.8,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,86,0,92.8,92,159.6,87,148.7,115,8.8,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,89,25,215.1,140,197.4,69,162.1,117,10.6,10,1,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,153,0,228.9,102,160.7,136,203.1,109,12.5,2,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,29,0,313.2,103,216.3,151,218.4,106,12.8,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,63,32,30.9,113,187.0,113,230.8,101,8.6,7,1,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,73,0,187.8,95,149.2,143,201.4,113,11.0,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,42,0,184.5,98,200.5,93,279.2,91,8.8,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,90,0,221.8,97,203.8,134,215.8,154,8.4,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,0,0,0,1,1,0 +0,105,0,211.1,99,176.7,66,221.5,96,14.7,7,4,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,97,32,90.0,87,276.3,113,185.2,107,8.6,6,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,121,0,103.3,110,129.1,82,167.1,113,10.7,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0,1,0 +0,15,0,135.2,101,152.5,79,224.8,83,8.4,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,128,0,199.3,86,194.8,102,298.2,82,14.3,2,4,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +1,130,0,271.8,129,237.2,128,210.1,91,8.7,2,4,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,73,0,214.2,90,196.8,78,157.9,112,5.9,8,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,1,0,1,0 +0,120,0,98.2,99,186.7,85,146.7,96,9.3,4,2,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,94,28,233.2,88,113.3,102,118.0,71,16.1,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,138,0,230.1,107,212.0,120,174.9,119,13.2,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,0,1,0,1,0 +0,149,0,176.2,87,145.0,81,249.5,92,5.7,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 
+0,88,0,181.9,90,151.5,87,143.0,100,7.5,3,1,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,101,0,253.2,89,237.9,114,154.3,85,9.7,7,4,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,209,0,227.2,128,258.4,92,183.5,74,8.9,4,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,134,38,214.4,93,211.7,57,165.0,79,10.0,8,1,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,116,0,189.5,90,189.8,118,205.8,83,13.1,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,32,26,266.7,109,232.3,107,212.8,98,16.3,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1,0,0,1 +0,129,0,207.0,91,154.9,121,245.1,112,13.4,5,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,0,0,1,0,1,0 +1,85,17,89.8,88,233.2,75,165.7,116,9.3,7,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,54,33,161.8,73,273.0,58,153.9,76,13.7,4,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,124,0,157.4,107,167.8,112,188.8,102,8.8,3,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0,1,0 +0,110,0,148.5,115,276.4,84,193.6,112,12.4,3,1,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,37,0,233.7,114,207.9,109,212.7,101,12.0,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,101,36,123.7,125,172.6,106,280.5,127,8.8,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,95,0,175.2,91,244.4,109,75.8,95,7.5,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0,1,0 +1,93,0,312.0,109,129.4,100,217.6,74,10.5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,1,0 +0,117,0,168.8,137,241.4,107,204.8,106,15.5,4,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,66,0,118.0,133,248.1,99,214.4,122,5.3,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,142,40,230.7,101,256.8,88,263.9,92,6.4,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,140,27,188.9,124,160.9,102,197.7,100,11.5,5,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1 +0,68,22,82.5,97,289.9,94,180.0,114,4.8,4,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +1,2,0,132.1,42,138.9,88,192.6,119,9.1,1,2,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 
+0,83,32,94.7,111,154.4,98,200.4,109,10.6,6,2,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1 +0,154,35,224.0,102,192.0,99,163.1,100,9.6,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,147,0,124.4,74,320.9,78,157.2,126,10.4,4,2,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,123,0,206.9,85,244.7,78,221.5,136,7.7,2,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,131,0,263.4,123,151.9,74,218.5,101,10.7,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +1,188,26,198.8,115,166.6,67,198.5,118,14.4,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1 +0,127,0,139.6,94,240.9,112,127.1,88,8.8,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,106,30,220.1,105,222.2,109,158.4,96,13.1,8,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,124,0,254.3,113,78.9,104,153.2,69,11.8,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,179,0,219.2,92,149.4,125,244.7,104,6.1,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +1,128,18,222.1,89,160.6,109,218.8,102,13.6,2,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,1 +1,76,0,241.0,120,231.8,96,220.2,67,9.9,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,1,0 +0,93,0,151.4,89,186.4,76,172.5,120,10.9,3,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,124,0,194.0,103,241.0,116,227.5,153,11.9,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,130,0,212.8,102,189.8,137,170.1,105,10.6,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,65,29,215.5,129,161.9,77,128.3,91,8.8,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +1,86,0,162.4,131,167.0,102,128.9,118,11.4,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,1,0 +0,22,0,181.8,108,198.6,148,206.6,96,9.3,3,2,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,102,0,224.2,81,243.3,90,147.8,66,12.0,8,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,1,0 +0,93,0,173.0,131,190.4,108,290.0,66,10.4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,85,0,201.4,52,229.4,104,252.5,106,12.0,3,1,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 
+0,112,16,200.3,72,197.8,91,151.1,92,10.4,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1 +0,137,0,215.9,76,145.4,118,186.9,129,12.1,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,144,0,150.0,69,285.9,73,190.6,121,9.4,15,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +0,94,38,170.1,124,193.3,116,105.9,73,12.8,4,1,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,129,32,211.0,99,155.1,89,234.8,96,11.4,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,40,0,242.5,82,232.9,97,154.0,86,9.6,7,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,118,35,256.3,119,258.1,91,215.5,130,11.7,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,115,0,200.2,92,244.9,107,190.9,96,8.8,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,167,0,244.8,91,60.8,105,176.7,110,10.7,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +0,139,0,165.0,132,249.7,86,170.3,128,12.6,8,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,128,34,142.3,73,194.8,79,239.3,81,16.0,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,67,0,120.9,58,235.0,88,95.1,130,11.4,11,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +0,166,0,173.9,103,276.4,83,190.8,113,15.3,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,1,0,1,0 +0,74,0,203.8,77,205.1,111,154.9,109,9.0,2,1,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,145,0,39.5,78,264.3,106,185.8,90,10.0,6,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,224,0,111.4,133,175.0,66,217.2,106,5.5,6,3,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,115,0,178.7,114,271.0,96,245.9,94,16.4,5,2,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,114,0,193.0,101,250.0,81,133.3,79,9.6,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,119,23,154.0,114,278.0,137,228.4,112,11.8,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,106,0,208.3,89,169.4,67,102.0,90,15.9,4,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,137,0,258.0,112,246.5,117,173.2,100,10.9,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,1,0 
+0,32,26,243.5,137,236.8,108,173.3,149,9.0,9,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,74,0,85.7,83,247.7,67,142.4,85,10.1,5,2,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,108,42,130.1,90,167.0,128,244.7,80,13.6,5,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1 +0,110,0,178.5,124,146.9,141,217.1,102,9.9,7,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,115,0,166.5,111,236.2,98,205.6,92,15.6,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,114,0,172.0,145,276.4,101,193.7,100,10.1,9,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,86,0,216.3,96,266.3,77,214.0,110,4.5,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +1,78,0,108.6,108,209.9,126,222.6,117,7.9,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,136,0,101.7,105,202.8,99,136.2,119,9.4,6,3,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,224,0,171.5,99,160.0,103,212.4,102,5.0,2,1,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,1,0 +0,153,0,154.6,56,263.0,84,367.7,89,15.5,2,1,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +1,99,0,242.3,102,350.9,102,163.1,93,11.3,3,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,1,0 +0,127,0,202.1,103,229.4,86,195.2,113,11.5,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,133,39,239.9,107,253.8,77,128.7,85,6.7,3,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,0,0,1,0,0,1 +0,119,0,98.8,97,146.9,68,190.7,105,10.0,4,3,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,107,36,96.3,83,179.6,91,166.3,121,10.3,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,116,17,193.4,112,240.6,131,248.1,98,11.4,3,5,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,78,0,162.3,116,192.4,86,240.6,100,10.1,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,141,0,151.5,104,242.2,114,304.2,109,10.8,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,0,1,0,1,0 +0,114,0,155.3,75,169.9,87,207.0,133,12.6,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,97,0,218.0,86,184.0,94,240.5,110,6.4,8,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 
+0,140,0,162.6,98,206.2,109,141.6,66,8.2,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,183,0,190.0,100,246.6,78,304.2,107,9.5,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,63,25,190.0,137,116.6,76,141.5,110,12.2,2,1,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,99,31,244.1,71,203.4,58,234.0,115,7.7,4,3,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +1,154,0,350.8,75,216.5,94,253.9,100,10.1,9,1,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,159,0,114.8,98,192.6,101,259.0,108,12.2,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,80,0,149.8,123,276.3,75,241.4,75,10.9,7,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,105,0,119.3,82,185.1,111,157.0,74,10.9,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,52,31,142.1,77,193.0,97,253.4,88,11.0,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,64,0,206.2,76,232.4,76,251.6,96,13.6,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0,1,0 +0,189,0,227.4,84,176.0,81,206.1,120,6.3,4,2,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,102,0,200.6,106,152.5,127,199.4,128,7.7,2,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1,0,1,0 +0,86,0,125.5,139,269.8,93,235.8,110,8.9,8,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,88,0,61.9,78,262.6,114,212.5,110,8.8,2,3,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,123,0,194.0,118,242.0,114,146.3,108,12.1,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,121,0,86.1,100,259.8,113,148.0,79,9.1,9,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,127,0,220.2,108,188.4,124,172.7,113,11.7,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,100,0,247.8,117,130.0,95,134.3,125,6.9,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,101,0,133.5,51,219.6,96,210.0,74,11.7,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +0,85,29,144.6,97,140.0,102,165.4,148,10.9,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,141,0,185.1,126,233.0,98,152.2,106,9.1,7,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 
+0,183,0,108.3,87,183.6,116,176.6,109,13.5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,92,0,173.1,140,240.3,105,233.2,117,9.0,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +0,48,0,190.4,92,317.5,85,133.4,113,8.3,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,78,21,160.6,85,223.1,79,124.0,92,9.5,1,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +1,170,42,199.5,119,135.0,90,184.6,49,10.9,3,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,101,9,160.1,116,210.0,121,139.1,65,10.8,9,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0,0,1 +1,110,0,293.3,79,188.5,90,266.9,91,14.5,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +0,108,41,171.6,110,136.1,78,183.4,103,10.8,7,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +1,123,0,236.2,135,273.9,88,227.0,77,10.1,6,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,87,22,263.8,65,103.4,115,208.1,109,8.5,3,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0,0,1 +0,90,27,156.7,51,236.5,118,123.2,111,12.6,6,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,87,0,115.4,90,262.6,68,245.7,69,13.1,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +1,162,0,70.7,108,157.5,87,154.8,82,9.1,3,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,102,0,273.2,85,211.1,82,203.7,129,13.1,7,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,112,0,206.2,122,164.5,94,140.3,101,12.6,7,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,78,0,220.0,95,179.9,121,188.2,109,11.5,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1,0,1,0 +1,124,0,312.0,112,180.0,109,168.6,94,12.8,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0,1,0 +0,75,42,248.9,93,170.8,108,104.5,91,11.2,8,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,152,0,140.5,92,186.8,96,227.0,89,9.5,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,127,36,183.2,117,126.8,76,263.3,71,11.2,8,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,99,0,190.4,102,158.1,107,271.5,92,11.2,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 
+1,50,0,131.1,129,160.5,94,206.9,88,5.6,9,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1,0,1,0 +0,98,0,169.9,77,138.3,155,142.6,105,8.5,7,1,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,59,31,225.0,78,191.3,79,226.7,79,9.1,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1 +0,45,0,112.8,108,218.8,120,240.2,106,9.0,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1,0,1,0 +0,106,0,158.7,74,64.3,139,198.5,103,10.2,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,67,0,176.2,120,236.0,138,152.5,104,10.6,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,0,0,1,0,1,0 +0,3,27,67.4,116,244.0,78,281.1,93,11.4,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,96,23,183.1,88,147.4,89,350.2,108,11.3,7,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +1,55,45,130.5,114,208.4,94,141.6,114,11.0,5,4,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +1,98,36,168.0,81,163.2,125,172.7,120,8.0,2,6,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,119,0,176.8,90,224.7,81,204.6,77,7.5,15,1,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,127,0,245.2,91,217.2,92,243.1,128,13.9,6,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,114,0,169.6,85,58.9,86,179.3,124,7.4,8,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,92,25,134.0,112,206.0,111,180.6,118,9.7,4,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,36,30,146.3,128,162.5,80,129.3,109,14.5,6,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +1,122,33,174.9,103,248.2,105,164.6,116,13.5,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,88,0,183.5,93,170.5,80,193.8,88,8.3,5,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,158,0,175.9,105,188.3,88,188.3,98,11.0,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,78,0,190.3,88,194.5,89,256.5,109,11.7,5,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,44,0,62.3,92,275.0,82,138.7,108,10.8,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,0,1,1,0 +0,128,29,179.3,104,225.9,86,323.0,78,8.6,7,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,1,0,0,1 +0,134,0,7.8,86,171.4,100,186.5,80,12.9,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 
+0,108,0,201.1,101,170.7,86,237.4,113,11.6,3,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,102,0,224.7,81,129.4,112,167.6,109,15.8,6,1,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,163,22,215.1,91,138.9,102,146.2,109,12.4,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,112,0,261.4,108,154.5,102,130.9,90,11.6,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0,1,0 +0,135,0,151.7,82,119.0,105,180.0,100,10.5,6,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,0,1,0,1,0 +0,61,15,252.4,106,187.8,69,259.6,137,10.0,3,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,123,28,124.7,105,250.4,78,216.4,128,7.8,8,1,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,122,0,140.1,120,231.4,128,188.1,127,11.2,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,100,0,142.5,87,195.7,88,122.1,117,7.8,8,2,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,106,0,187.1,104,250.2,117,144.9,81,11.0,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,75,0,305.1,106,188.0,115,235.4,116,8.5,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,95,0,128.6,115,216.2,88,255.3,96,6.3,2,6,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,180,0,143.5,121,189.3,111,174.9,82,8.8,5,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,117,25,216.0,140,224.1,69,267.9,112,11.8,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +1,137,0,135.1,95,134.1,102,223.1,81,12.3,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +1,17,31,153.1,115,185.9,59,224.3,102,10.0,1,6,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +1,179,0,287.3,123,288.0,114,266.0,112,10.5,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,62,0,159.9,100,172.2,99,263.2,109,5.6,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,118,0,205.2,115,184.8,137,176.1,115,7.0,6,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,124,0,158.6,104,211.2,77,179.3,104,10.2,8,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,157,0,240.2,67,153.0,98,249.0,72,10.2,6,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 
+0,84,38,193.0,106,153.6,106,260.4,87,7.4,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,92,45,281.1,88,198.0,103,94.3,76,7.5,3,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1 +0,88,0,55.6,65,242.7,121,176.3,134,11.3,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,51,0,183.1,99,160.1,107,311.8,121,7.0,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,100,0,159.9,94,179.9,95,154.4,102,11.6,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +0,107,0,157.1,79,162.6,124,150.0,138,12.1,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,166,0,197.9,89,251.0,113,138.3,85,11.2,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,122,40,216.4,80,249.7,90,185.9,99,12.7,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1 +0,114,19,154.6,100,241.6,109,160.0,112,12.6,1,3,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,67,20,230.6,40,189.1,58,162.2,115,9.4,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,147,0,212.8,79,204.1,91,156.2,113,10.2,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,130,0,203.9,63,191.8,93,132.5,125,12.1,4,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,47,28,172.9,109,137.6,94,203.8,109,8.3,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,3,0,185.0,120,203.7,129,170.5,89,14.1,3,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,97,28,283.1,93,185.4,98,312.8,78,6.1,8,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,152,0,161.4,84,163.6,88,153.2,121,11.8,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,1,0,1,0 +0,105,0,213.4,100,204.9,52,179.7,93,9.5,6,1,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,113,30,183.8,102,183.4,123,235.0,52,11.6,7,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,61,0,197.3,67,264.5,106,210.5,116,9.0,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,62,0,159.7,86,197.5,76,121.6,105,13.9,6,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +0,63,0,211.7,107,271.7,77,203.3,108,7.4,7,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,0,1,0,1,0 
+0,27,0,232.1,81,210.8,101,165.4,87,15.0,6,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,66,26,254.9,108,243.2,135,190.8,95,5.4,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,124,0,178.3,102,235.0,120,239.7,119,10.9,1,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0,1,0 +0,63,0,211.2,80,237.7,93,259.2,58,12.3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,146,0,109.0,69,265.8,98,228.3,80,12.6,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1,0,1,0 +0,122,0,232.5,96,205.5,120,213.7,91,11.9,2,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,143,0,155.5,101,213.4,89,237.9,61,7.6,11,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,71,31,115.4,90,217.4,78,239.9,102,13.1,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +1,48,22,152.0,63,258.8,131,263.2,109,15.7,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +1,94,0,234.4,103,279.3,109,234.2,121,2.0,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,143,0,202.8,109,165.8,104,143.9,71,4.6,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,1,0,1,0 +0,15,0,141.4,80,123.9,76,323.5,88,8.1,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,1,0 +0,109,0,184.0,120,120.4,119,153.7,86,11.0,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,0,1,0,1,0 +1,113,0,209.4,151,347.3,113,246.0,116,7.4,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,168,22,175.9,70,211.7,105,174.5,81,7.3,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,94,0,181.8,85,202.4,98,245.9,97,9.2,2,4,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,57,0,221.1,101,236.7,65,252.3,137,9.5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,44,0,288.1,112,258.0,92,192.4,90,10.2,4,3,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,74,27,154.1,122,195.3,150,276.7,86,13.2,2,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,151,0,175.3,106,144.3,87,160.2,88,11.8,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,196,0,133.1,80,206.5,120,221.6,96,10.3,8,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 
+0,80,36,190.3,115,256.6,78,214.9,145,3.8,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,136,0,109.4,91,207.5,111,135.0,107,11.6,5,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,95,0,183.4,98,281.3,95,105.2,113,8.2,8,1,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,100,0,96.5,86,210.2,133,146.4,106,12.5,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,155,21,195.9,91,213.9,84,88.2,111,8.6,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,171,25,223.2,77,183.2,118,150.8,90,10.2,3,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,43,0,230.2,147,186.7,121,128.4,100,9.2,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1,0,1,0 +0,131,36,214.2,115,161.7,117,264.7,102,9.5,4,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,174,0,139.4,96,143.4,108,225.2,107,10.0,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,94,0,243.2,109,147.0,88,94.9,99,7.2,4,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,1,0 +0,168,42,97.4,57,203.6,98,173.9,124,11.4,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,72,29,139.8,114,138.2,91,221.0,88,5.5,6,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,0,1,0,1 +0,62,0,120.7,70,307.2,76,203.0,99,13.1,6,4,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,64,0,261.9,113,148.1,99,145.2,74,13.8,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +1,44,0,308.6,139,150.8,94,198.7,66,7.3,3,4,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,87,0,234.8,85,140.9,91,204.3,93,9.5,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +1,74,0,298.1,112,201.3,100,214.7,88,9.7,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,142,26,220.5,94,239.5,126,254.3,109,5.9,9,2,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,201,21,192.0,97,239.1,81,116.1,125,15.1,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +1,128,32,223.5,81,188.8,74,154.9,101,9.4,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1 +1,123,0,125.5,106,128.9,96,251.9,129,6.3,6,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 
+1,154,0,145.9,69,208.2,141,180.9,106,14.4,10,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +1,104,0,234.2,128,293.1,92,183.9,79,9.8,6,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,90,0,76.1,121,290.3,73,236.9,89,10.8,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,95,0,184.2,95,181.6,101,143.4,113,12.8,4,2,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,126,0,190.9,143,149.7,72,191.4,87,13.0,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,69,0,227.0,122,258.7,111,169.7,87,8.9,1,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,112,0,266.0,97,214.6,94,306.2,100,14.2,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,85,0,259.8,85,242.3,117,168.8,72,5.4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,108,25,170.7,88,109.9,113,165.7,99,8.7,1,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +1,103,0,174.7,151,148.0,56,168.2,109,15.8,3,6,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,56,0,150.9,79,161.8,87,167.7,115,11.7,5,3,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,97,0,215.3,58,242.4,91,279.8,105,12.1,9,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,118,26,170.8,114,199.5,125,169.7,98,9.6,5,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,84,42,214.3,112,188.2,107,333.5,117,11.3,10,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,0,1,0,0,1 +1,84,0,289.1,100,233.8,97,223.5,148,12.7,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,0,1,0,1,0 +0,135,0,144.1,115,249.8,68,211.4,82,13.6,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,143,0,110.1,113,169.0,59,166.7,94,9.2,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,1,0,1,0 +0,116,0,229.3,93,184.5,111,168.2,91,8.8,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,58,0,243.1,105,231.4,108,180.9,120,7.8,4,2,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,8,36,242.9,67,170.9,59,177.3,130,4.8,12,1,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,156,0,277.0,119,238.3,106,94.4,96,8.3,3,1,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,1,0 
+1,159,0,189.1,105,246.1,147,242.0,106,10.4,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,118,0,136.1,120,204.2,103,228.2,90,11.3,4,1,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,47,37,112.8,150,243.9,97,178.7,112,13.2,6,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,0,1,0,0,1 +1,83,0,337.4,120,227.4,116,153.9,114,15.8,7,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,1,0,182.1,106,134.9,106,152.3,75,10.0,3,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,119,15,160.0,95,209.5,110,82.3,107,8.7,5,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1 +0,63,0,62.9,112,202.9,111,259.0,58,8.9,8,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +0,141,0,160.1,87,256.7,120,270.0,107,7.0,1,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,145,0,199.2,124,126.0,86,289.2,135,7.6,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,0,0,1,0,1,0 +0,141,0,51.9,108,162.0,83,223.5,115,10.1,3,3,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,122,0,35.1,62,180.8,89,251.6,58,12.7,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,120,0,202.0,123,184.3,78,176.0,89,7.4,2,2,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,1,0 +0,101,0,183.9,115,255.9,101,275.0,145,10.8,11,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,39,0,93.3,83,199.6,114,206.2,104,6.5,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,128,0,179.4,94,270.4,92,191.0,88,7.9,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,68,0,222.8,99,175.8,85,202.0,111,11.0,3,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,5,0,199.2,106,187.3,12,214.0,85,13.3,3,3,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,110,34,192.3,114,129.3,114,136.3,102,6.3,12,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,66,0,208.7,84,173.3,88,264.7,107,8.3,3,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,0,1,1,0 +0,95,0,142.5,109,176.1,107,189.6,88,8.2,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,127,0,180.9,114,209.5,118,249.9,105,7.4,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,1,0 
+0,130,0,139.1,72,246.0,112,207.2,121,11.4,9,5,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +1,132,0,197.8,66,133.9,119,177.3,94,10.9,3,4,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,152,20,239.1,105,209.1,111,268.2,130,13.3,3,5,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,86,32,70.9,163,166.7,121,244.9,105,11.1,5,3,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,53,0,184.8,98,216.4,125,141.1,116,18.4,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,150,0,146.3,133,202.7,95,234.7,103,13.1,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,90,0,61.3,91,194.4,94,143.1,80,11.4,9,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1,0,1,0 +0,46,0,214.1,72,164.4,104,177.5,113,8.2,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,67,40,104.9,65,216.3,93,217.4,128,9.6,9,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,19,34,156.6,97,224.2,97,260.9,135,11.3,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,113,0,156.0,141,256.8,72,175.3,123,11.9,5,2,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,81,0,175.5,67,249.3,85,270.2,98,10.2,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,88,0,161.5,92,173.5,108,206.2,95,7.9,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,118,0,223.4,98,220.6,101,203.9,118,6.3,6,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,1,0 +0,118,39,91.5,125,219.9,113,229.0,99,12.7,8,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,106,0,193.6,66,238.2,82,176.4,107,12.9,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,0,1,0,1,0 +1,71,0,278.9,110,190.2,67,255.2,84,11.7,7,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,65,0,136.1,112,272.9,96,220.2,104,4.4,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,71,0,290.4,108,253.9,92,263.3,126,10.1,5,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,148,26,158.7,91,160.5,127,218.3,88,9.9,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,122,0,180.0,88,145.0,77,233.7,120,11.5,6,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 
+0,58,39,211.9,40,274.4,76,210.5,139,5.4,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1,0,0,1 +0,80,0,118.1,90,144.3,77,225.1,86,8.2,6,1,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +1,57,0,115.0,65,122.3,96,245.0,75,6.4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,1,0 +0,91,0,134.7,116,295.3,98,195.5,121,6.6,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,13,0,146.4,74,148.5,92,216.7,96,11.3,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0,1,0 +0,67,0,201.4,101,97.6,122,202.5,119,7.0,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,63,0,58.9,125,169.6,59,211.4,88,9.4,3,1,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,119,24,217.2,94,138.7,52,139.3,85,11.3,4,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,110,38,236.3,102,195.9,112,183.5,82,9.7,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,77,23,209.7,73,183.6,63,205.5,111,7.1,3,2,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,64,0,219.2,73,167.0,65,161.4,119,10.0,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,105,34,218.5,61,196.7,74,151.1,103,9.9,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,1,0,0,1 +0,129,0,98.0,99,240.7,62,254.8,123,10.5,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,59,32,211.9,120,202.9,136,213.5,95,8.8,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,86,28,221.6,74,288.4,100,240.3,105,9.0,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,110,0,227.7,88,170.0,96,128.7,57,11.7,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,107,38,204.2,57,205.9,92,286.5,80,8.3,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,106,0,191.4,124,200.7,116,230.1,76,8.2,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,127,22,166.0,114,174.5,103,244.9,68,10.2,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,155,30,128.5,86,188.4,91,254.4,85,6.8,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,75,0,147.5,110,191.7,97,135.0,68,16.4,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,143,33,239.2,109,235.5,112,156.3,95,9.5,4,1,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 
+0,109,0,200.1,72,300.9,120,236.0,68,11.9,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,67,30,186.2,117,286.7,76,164.3,113,12.9,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,67,0,129.0,78,188.0,116,235.0,102,11.2,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,124,0,150.3,101,255.9,112,136.7,62,12.5,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,81,0,203.5,89,289.6,69,212.9,71,8.7,3,3,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,73,0,157.1,109,268.8,83,181.5,91,10.0,8,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,1,0,1,0 +0,66,0,205.1,102,232.7,109,259.9,95,9.2,6,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,0,1,0,1,0 +0,36,16,149.4,111,131.8,113,132.7,87,6.7,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,149,0,196.3,108,136.8,96,154.7,87,7.7,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +1,92,0,252.3,120,207.0,112,284.6,95,12.0,5,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,0,1,1,0 +0,148,0,216.2,95,185.7,105,300.0,143,10.0,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,116,0,173.5,93,194.1,76,208.0,112,16.2,10,3,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +1,78,0,103.5,115,117.9,102,201.0,94,12.0,3,4,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,90,0,167.5,96,139.1,104,138.4,87,13.0,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,110,0,192.6,102,178.9,118,214.6,74,9.4,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,61,0,234.2,76,216.7,108,130.6,122,13.9,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,76,0,173.2,93,131.2,80,170.9,104,5.4,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,92,0,249.4,118,211.5,95,169.0,116,9.1,3,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,115,0,245.2,105,159.0,109,229.9,74,7.2,8,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,84,35,207.5,138,201.0,116,164.5,107,7.5,16,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,110,0,135.1,109,205.2,99,166.3,119,11.7,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 
+0,85,0,255.3,114,194.6,83,276.6,78,3.7,5,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,209,0,153.7,105,188.6,87,200.8,95,10.7,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,93,32,138.1,91,167.3,72,238.9,115,6.8,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,0,0,1,0,0,1 +0,43,0,179.3,97,252.7,126,227.5,114,8.0,5,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,21,0,223.2,142,216.5,114,214.7,111,12.4,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,1,0,1,0 +1,74,0,282.5,114,219.9,48,170.0,115,9.4,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,62,0,86.3,84,238.7,99,238.4,79,12.5,1,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,1,0,1,0 +0,84,0,190.2,102,197.7,141,247.5,102,9.8,6,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,84,0,138.6,102,199.0,93,204.1,137,7.8,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +1,76,0,246.8,110,206.3,63,208.4,123,13.2,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,0,1,1,0 +0,127,0,224.3,112,185.7,103,159.4,83,10.0,1,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,1,0,1,0 +0,112,0,243.4,77,182.1,97,259.2,94,12.8,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,149,43,206.7,79,174.6,122,241.5,80,10.9,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,148,38,209.2,110,116.6,73,109.6,105,16.5,4,2,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +1,105,0,106.4,71,240.1,83,147.7,114,5.3,4,6,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,103,0,62.8,124,170.4,66,280.2,78,9.4,4,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,1,0 +0,156,0,123.7,96,103.0,80,189.4,82,13.1,4,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,148,21,262.9,135,149.5,96,140.5,109,8.1,4,1,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +1,73,31,82.3,105,256.1,91,229.6,98,11.8,2,6,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,27,0,82.6,105,204.0,99,224.2,122,9.1,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,95,0,157.9,103,259.6,90,230.0,117,14.0,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,62,0,100.0,98,173.5,95,218.0,122,10.1,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,0,1,0,1,0 
+0,78,0,193.1,85,172.1,105,129.6,119,10.2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,100,0,188.5,152,148.3,115,179.8,88,15.2,5,2,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,119,22,172.1,119,223.6,133,150.0,94,13.9,20,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1 +0,93,0,164.9,68,210.4,86,229.4,104,7.8,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,124,0,251.5,85,214.2,98,186.1,71,11.1,6,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,47,30,196.6,93,241.4,140,226.0,118,12.9,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,48,0,149.2,146,161.9,109,197.9,109,8.3,2,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,1,0,1,0 +0,92,0,197.0,84,269.3,105,158.9,105,10.8,4,1,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,116,0,129.4,84,157.3,89,215.5,77,13.3,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,0,1,0,1,0 +0,80,0,197.6,83,164.5,86,94.0,98,6.4,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,158,0,202.0,126,163.5,86,195.4,84,10.4,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,92,0,212.4,105,224.6,118,221.3,105,9.0,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,78,0,152.9,81,256.6,82,173.6,112,5.3,6,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,195,36,231.7,110,225.1,88,201.7,89,12.1,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,0,1,0,0,1 +0,170,0,184.1,106,204.9,70,224.3,133,9.8,3,2,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,99,0,197.2,127,156.0,92,204.1,99,9.9,6,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +0,126,0,197.6,126,246.5,112,285.3,104,12.5,8,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,0,1,1,0 +0,90,0,261.8,128,220.6,104,136.6,91,9.6,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,79,0,103.5,134,319.3,111,239.9,124,8.4,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,1,0 +0,38,31,197.2,118,249.9,70,298.9,104,3.9,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,131,34,156.6,134,71.0,95,261.7,120,13.4,10,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,0,1,0,0,1 
+0,46,0,156.4,105,185.5,98,226.7,96,11.8,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,130,0,150.4,119,230.5,99,186.3,76,12.3,4,1,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,98,0,187.2,127,195.6,88,181.8,129,5.1,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,80,0,220.0,114,207.7,76,168.4,137,12.1,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,147,0,205.3,95,166.7,128,240.6,84,7.8,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,145,0,229.6,82,138.1,103,250.8,109,3.3,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1,0,1,0 +0,109,0,175.4,125,250.7,87,289.3,74,9.8,9,1,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +1,123,0,123.2,104,190.0,117,170.3,95,12.9,5,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,46,0,210.2,92,227.3,77,200.1,116,13.1,7,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,143,24,186.6,69,222.0,116,234.9,138,11.6,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,120,23,221.9,114,254.7,84,250.5,117,7.2,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,190,26,116.7,71,145.9,88,175.1,103,9.9,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,217,0,176.4,115,158.8,128,306.6,107,9.3,3,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,112,29,198.8,122,238.6,114,289.5,69,11.5,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,0,0,1,0,0,1 +0,90,0,203.4,146,226.7,117,152.4,105,7.3,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,134,40,142.9,105,88.6,61,290.0,96,10.8,6,1,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,109,0,162.6,138,154.0,109,209.7,118,11.5,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,144,0,177.5,93,287.4,75,180.5,118,11.9,3,2,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,41,0,207.2,138,214.1,83,193.0,105,11.9,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,150,0,169.2,123,216.8,83,179.4,107,12.6,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,1,0,1,0 +0,107,0,123.1,100,158.4,82,256.1,82,9.3,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0,1,0 
+0,123,0,175.7,78,184.6,96,156.9,92,9.1,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0,1,0 +0,160,0,216.8,77,207.3,117,228.6,117,5.6,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,103,0,206.5,125,180.2,113,220.6,95,12.2,4,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,131,0,187.9,110,200.5,101,202.6,125,10.2,11,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +1,88,0,301.5,136,257.7,72,132.9,118,13.4,2,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,106,0,178.4,143,247.0,123,259.9,105,9.6,2,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,79,0,177.9,83,167.3,84,223.7,142,15.2,8,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,90,0,179.2,77,210.7,99,276.9,58,9.2,6,2,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,27,0,72.7,75,208.6,117,65.8,71,9.9,3,1,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,57,33,193.4,105,231.6,79,226.2,90,11.1,11,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,7,30,221.4,114,165.8,116,247.0,105,10.8,12,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,90,39,94.8,89,219.1,91,197.4,65,11.4,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1,0,0,1 +1,106,0,210.6,96,249.2,85,191.4,88,12.4,1,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,110,0,159.5,145,202.3,101,256.0,96,16.7,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,121,35,193.8,62,197.6,97,218.8,95,5.9,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +1,125,0,187.3,118,160.7,111,263.8,112,9.6,2,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +0,61,25,163.7,78,113.2,112,134.1,118,9.9,3,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,92,0,255.8,125,142.7,111,181.2,101,11.7,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +0,93,0,190.7,114,218.2,111,129.6,121,8.1,3,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,1,0,1,0 +1,107,0,294.9,71,192.8,78,148.1,87,13.2,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,204,0,205.2,145,154.8,95,191.4,77,14.1,5,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 
+0,83,0,132.4,120,121.6,101,197.7,84,8.6,2,1,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,19,0,201.5,123,129.2,110,220.6,98,12.9,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,128,25,265.1,110,197.4,99,244.7,91,10.0,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,65,31,282.3,70,152.0,89,225.5,93,12.0,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,146,0,111.1,126,313.4,95,215.7,82,10.5,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,0,1,1,0 +0,32,31,232.8,97,183.5,111,206.8,111,13.0,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,83,0,271.5,87,216.3,126,121.1,105,11.7,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,0,0,1,1,0 +0,66,36,106.7,76,209.8,77,190.4,117,12.1,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,121,0,207.9,98,210.5,96,109.6,114,7.7,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,0,1,0,1,0 +0,119,26,132.0,100,173.3,121,203.5,108,11.6,5,1,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,185,19,157.3,123,257.7,94,190.4,107,9.6,6,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,97,0,239.8,125,214.8,111,143.3,81,8.7,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,107,0,146.9,94,114.3,111,114.5,97,11.4,5,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,102,0,144.4,87,266.5,128,217.6,59,7.1,7,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,158,0,205.2,97,240.6,77,79.7,108,14.4,12,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,113,0,72.5,88,204.0,112,117.9,118,6.6,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,1,0,1,0 +0,97,43,121.1,105,260.2,115,222.4,100,8.3,5,3,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,59,0,182.5,104,204.7,95,229.9,100,11.3,8,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,126,0,103.7,93,127.0,107,329.3,66,14.4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,127,0,218.6,93,149.9,130,204.6,131,9.2,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,1,0,1,0 +0,104,0,280.4,127,179.4,79,150.6,77,15.2,6,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 
+0,139,0,161.5,121,192.9,137,168.3,96,11.2,13,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,0,1,1,0 +0,117,0,102.8,119,206.7,91,299.0,105,10.1,7,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,35,0,124.2,102,123.9,115,135.7,100,13.1,8,2,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,73,0,137.1,102,210.8,114,191.4,120,11.1,4,1,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,65,0,213.4,111,234.5,94,250.1,123,2.7,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,90,0,145.6,103,197.1,137,294.5,83,10.5,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,93,21,134.2,105,162.5,128,186.6,90,11.8,2,4,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +1,115,0,122.0,110,220.2,100,179.7,124,10.8,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,1,0 +0,103,31,185.4,105,197.6,126,147.1,110,14.5,4,2,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,87,28,143.5,106,223.5,147,175.4,69,11.2,5,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,138,0,170.5,87,118.2,116,187.9,111,11.2,7,2,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,46,0,139.4,81,223.7,113,173.1,77,13.6,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,149,0,207.3,115,198.4,82,114.1,83,8.6,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,84,0,86.0,83,260.7,86,98.6,109,8.9,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1,0,1,0 +0,97,24,133.2,135,217.2,58,70.6,79,11.0,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0,0,1 +0,151,0,235.9,104,80.6,91,212.8,116,5.8,2,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0,1,0 +0,147,0,183.8,113,164.7,110,111.0,87,10.1,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,1,0 +1,133,0,254.7,103,252.2,80,178.1,103,8.0,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,1,0 +0,64,27,201.3,101,143.8,89,150.2,127,12.3,3,1,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,127,0,221.0,100,160.7,113,233.1,96,6.8,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,112,27,213.0,121,226.2,101,189.8,99,11.1,3,4,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 
+0,38,36,115.4,98,166.2,83,184.7,79,15.2,6,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +1,61,0,267.1,104,180.4,131,230.6,106,17.3,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,56,0,164.3,92,233.7,107,187.3,104,11.8,1,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +1,49,0,266.3,90,207.8,117,205.0,98,14.0,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,125,0,169.3,90,156.0,138,210.8,106,11.6,6,2,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +1,92,29,201.3,130,203.7,115,129.9,113,6.4,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,98,0,162.8,65,185.0,109,219.5,104,6.0,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,112,0,214.8,112,209.7,104,164.4,97,9.4,5,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,176,23,283.2,130,162.6,74,177.7,104,7.2,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,95,0,58.2,96,202.1,126,210.5,97,10.4,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,115,0,210.6,120,153.1,84,262.2,79,11.0,5,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,110,0,151.8,106,138.0,126,233.5,112,11.2,8,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,68,29,239.5,82,203.8,105,167.8,70,9.9,6,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,97,15,117.6,97,196.3,126,157.4,113,6.4,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +1,182,0,279.1,124,180.5,108,217.5,104,9.5,11,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,87,39,82.6,113,224.4,63,163.6,88,9.5,1,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,120,0,131.7,99,163.1,109,201.1,116,10.7,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,54,0,272.6,83,248.7,74,197.4,111,9.5,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,57,37,201.2,76,280.1,122,154.2,110,11.8,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,93,32,218.7,117,115.0,61,192.7,85,9.4,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,1,0,0,1 +0,34,0,124.8,82,282.2,98,311.5,78,10.0,4,2,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 
+0,58,0,165.4,100,115.7,87,193.8,118,12.8,5,2,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,92,0,201.9,74,226.8,119,217.5,80,13.7,6,3,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,55,28,105.3,82,197.4,109,187.5,91,8.6,6,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,57,0,154.2,78,196.7,85,253.5,97,10.1,9,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +1,122,0,231.2,141,267.8,136,240.3,100,8.8,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,106,29,83.6,131,203.9,131,229.5,73,8.1,3,1,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +1,75,0,314.6,102,169.8,86,285.1,100,5.7,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,102,0,102.6,89,246.0,77,170.5,140,9.1,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,100,29,213.6,127,175.9,82,207.2,100,8.9,3,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,122,0,146.3,117,218.7,93,236.0,97,11.5,5,1,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +0,57,17,236.5,94,163.1,94,236.7,117,12.2,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1,0,1 +0,109,46,217.5,123,233.7,84,163.9,99,9.0,3,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,114,34,154.4,109,221.4,142,208.5,103,10.3,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,70,30,143.4,72,170.0,92,127.9,68,9.4,4,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,1,0,0,1 +0,193,0,170.9,124,132.3,95,112.9,89,11.6,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,62,0,172.4,132,230.5,100,228.2,109,11.0,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0,1,0 +0,53,0,57.5,95,265.5,131,244.3,128,11.6,6,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,112,31,225.2,89,256.8,117,249.7,87,11.5,1,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,93,0,146.3,85,216.6,95,233.0,82,11.5,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,95,0,197.0,88,190.4,68,211.9,104,16.1,8,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,1,0,1,0 +1,94,0,89.5,94,339.9,106,172.9,76,7.9,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,1,0 +0,164,0,123.3,78,170.0,85,165.9,78,12.7,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,1,0,1,0 
+0,159,0,87.7,103,278.2,97,170.6,93,10.5,10,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,58,43,142.8,96,272.3,100,193.4,105,8.9,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1 +0,67,36,115.6,111,237.7,94,169.9,103,9.9,12,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,133,0,143.8,71,184.0,131,275.5,132,12.9,4,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,122,33,270.8,96,220.4,110,169.9,104,11.8,8,4,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +1,93,0,131.4,78,219.7,106,155.7,103,11.1,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,1,0 +1,81,0,261.4,141,215.7,102,271.8,96,8.0,6,1,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,101,0,154.4,130,217.2,101,185.4,52,13.9,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,6,0,183.6,117,256.7,72,178.6,79,10.2,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +1,147,0,274.0,92,231.8,82,283.6,83,6.2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +1,100,0,107.0,63,105.7,67,243.1,74,12.8,3,4,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +1,154,0,154.5,122,214.2,71,178.0,105,12.0,2,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,13,25,176.6,65,172.7,96,104.5,128,11.3,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,1,0,0,1 +1,112,0,174.3,123,140.2,124,215.4,89,9.0,6,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,93,22,306.2,123,189.7,83,240.3,107,11.7,2,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,67,0,215.5,102,190.7,95,214.5,106,8.6,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,148,0,233.5,81,187.7,71,122.3,97,9.6,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +0,56,0,177.7,114,215.6,110,236.7,67,10.5,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +0,119,0,154.5,129,193.6,87,180.9,145,13.4,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,67,0,210.7,116,219.2,86,179.7,83,7.2,6,1,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,69,0,195.1,91,261.5,57,203.8,90,11.4,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 
+0,105,27,141.2,96,167.7,94,274.4,101,11.4,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,172,47,274.9,102,186.6,118,245.0,123,8.8,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,68,30,122.9,93,233.5,91,199.5,144,9.6,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,177,0,227.8,81,161.8,97,217.0,106,8.0,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1,0,1,0 +0,148,25,230.7,102,233.8,109,215.8,90,13.5,2,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,108,0,210.6,117,164.2,103,201.4,68,9.4,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,161,0,156.1,114,180.3,63,179.6,115,11.1,9,2,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +0,63,36,199.0,110,291.3,111,197.6,92,11.0,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1 +0,163,23,224.0,126,233.5,89,293.9,104,8.8,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,139,25,138.3,96,80.6,79,163.7,83,8.3,8,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,93,20,187.5,110,169.8,94,175.3,127,12.1,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1 +0,83,0,208.9,71,214.8,92,247.9,108,13.0,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,28,0,236.8,102,167.1,87,280.2,115,9.7,3,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,110,0,162.0,81,247.5,89,155.5,99,8.9,8,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +1,161,0,332.9,67,317.8,97,160.6,128,5.4,9,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,80,0,160.6,103,237.0,109,245.1,88,10.7,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,0,1,0,1,0 +0,79,0,205.7,123,214.5,108,226.1,106,6.7,18,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,99,0,155.3,93,265.7,95,145.7,67,12.4,4,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,127,25,173.0,91,245.8,64,300.0,99,4.8,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,0,0,1,0,0,1 +0,138,0,87.6,112,266.9,107,214.6,104,9.8,10,2,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,165,0,216.6,126,190.8,104,224.7,123,12.4,8,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 
+0,56,0,222.7,133,277.0,89,101.8,94,13.6,4,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,85,30,173.1,107,247.2,101,158.7,104,11.5,5,1,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,105,0,162.3,99,212.5,95,214.7,114,11.1,8,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,116,0,197.9,84,168.1,113,239.8,145,12.0,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +1,52,0,155.0,110,133.4,104,176.1,84,7.0,4,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,64,0,225.3,134,108.2,87,139.6,132,17.3,9,1,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +1,120,0,179.9,72,170.0,98,190.6,89,13.8,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,0,0,1,1,0 +0,56,0,146.1,57,196.2,97,310.1,110,9.2,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,147,24,219.9,118,208.5,116,352.5,111,8.1,4,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1 +0,172,0,215.7,140,146.3,84,264.6,83,7.1,1,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,30,30,217.4,74,213.8,86,227.2,104,6.6,3,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,18,0,273.6,93,114.6,116,250.6,120,8.2,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,101,42,209.2,82,159.7,74,181.6,100,9.5,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,184,0,151.7,93,178.5,77,229.1,111,13.1,8,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,0,1,0,1,0 +0,107,0,167.3,100,163.9,79,185.9,100,6.7,5,2,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,105,0,146.4,81,225.1,80,230.1,117,8.5,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,125,0,206.0,128,198.1,71,135.9,116,13.2,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0,1,0 +0,91,0,231.8,120,150.6,106,269.2,129,11.6,7,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +1,81,28,167.9,147,190.7,105,193.0,103,9.2,6,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,54,33,112.0,90,208.0,112,150.3,83,11.3,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,1,0,0,1 +0,44,0,202.6,89,163.0,96,268.1,151,8.3,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,0,1,0,1,0 
+0,93,0,179.3,93,178.6,98,225.2,131,11.5,6,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,42,0,146.3,84,255.9,113,45.0,117,8.0,12,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1,0,1,0 +0,73,0,122.0,92,138.3,114,224.2,128,5.8,5,1,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,69,0,195.3,70,216.7,108,259.9,119,12.5,4,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +1,64,0,148.1,73,164.9,101,216.0,125,12.3,2,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +1,121,30,198.4,129,75.3,77,181.2,77,5.8,3,3,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,142,24,239.8,103,285.9,65,256.7,106,9.5,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1,0,0,1 +0,63,13,214.2,61,181.2,88,174.0,68,10.3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,1,0,0,1 +0,117,13,207.6,65,152.7,77,232.8,95,9.7,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +1,64,0,236.2,77,218.6,85,194.1,97,13.2,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,1,0 +1,92,0,154.0,122,329.8,88,288.0,117,5.6,2,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,75,38,163.6,132,146.7,113,345.8,115,13.1,3,3,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,113,0,215.9,93,240.1,85,156.7,123,4.9,5,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,132,36,226.2,103,181.6,125,258.8,102,10.5,5,3,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,114,36,309.9,90,200.3,89,183.5,105,14.2,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,81,0,153.5,99,197.6,102,198.5,86,6.3,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,52,38,169.3,88,225.9,97,172.0,86,8.2,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,99,0,128.8,86,203.9,105,282.6,131,14.1,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,103,0,171.7,78,144.5,86,157.9,106,6.8,3,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,1,0 +0,20,35,171.5,98,153.1,127,165.6,125,7.1,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,63,0,180.5,126,230.0,98,232.5,73,10.6,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,94,0,245.0,112,180.4,91,262.9,105,9.7,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 
+0,74,0,172.1,105,211.7,99,182.2,105,11.6,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,58,0,247.2,116,303.7,103,105.4,94,9.3,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,95,0,190.2,119,157.1,70,181.5,120,14.0,6,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,94,0,252.6,104,169.0,125,170.9,106,11.1,7,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,74,0,102.7,89,149.3,100,188.1,114,11.0,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,115,0,132.0,90,197.5,75,175.8,114,0.0,0,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,1,0,1,0 +0,50,24,214.3,129,289.8,55,312.5,130,10.6,4,1,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,64,37,154.6,92,83.4,103,165.9,99,13.3,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,123,23,245.0,88,265.0,105,239.7,108,14.9,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,169,0,100.8,112,230.0,69,193.6,95,9.5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,88,25,288.5,114,203.4,74,228.4,117,13.0,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1 +0,125,0,168.6,99,175.6,107,243.3,92,10.9,7,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,50,0,157.1,90,223.3,72,181.4,111,6.1,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,80,0,51.5,90,164.0,98,169.4,80,9.5,4,3,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,106,0,165.3,118,210.0,101,187.2,93,8.5,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,113,0,122.2,112,131.7,94,169.5,106,10.3,9,5,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +0,113,0,150.1,120,200.1,85,266.7,105,11.0,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,124,0,157.5,70,130.7,79,193.4,98,9.6,4,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,103,0,70.9,134,134.5,112,168.8,164,12.0,6,2,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,112,0,166.0,79,74.6,100,247.9,74,6.3,7,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,82,0,200.3,96,201.2,102,206.1,60,7.1,1,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,0,0,1,0,1,0 
+0,100,30,58.8,104,219.5,107,152.3,118,7.1,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,99,0,140.7,88,210.9,98,229.9,125,12.4,4,2,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,155,0,163.1,94,291.7,108,96.4,111,11.2,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,1,0 +0,100,38,224.7,121,294.0,131,290.0,61,9.8,6,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,1,0,0,1 +0,66,0,201.3,95,152.8,66,233.2,101,7.5,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,121,0,150.7,105,197.3,133,169.0,116,9.2,15,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +1,161,0,191.9,113,70.9,87,204.8,107,13.4,4,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0,1,0 +0,94,0,220.8,111,156.2,67,187.9,89,10.5,4,2,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,56,0,226.0,112,248.5,118,140.5,142,6.9,11,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,111,28,128.8,104,157.3,52,147.4,76,10.3,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,116,0,137.4,126,120.0,94,130.3,64,12.4,2,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,92,0,196.5,82,190.0,89,163.2,99,10.8,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,130,0,213.1,105,206.2,108,163.4,93,8.9,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,132,39,175.7,93,187.2,94,225.5,118,8.6,3,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,82,0,101.0,93,155.6,104,304.4,93,13.3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,1,30,183.1,95,232.6,110,248.3,110,8.4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,85,0,144.4,88,264.6,105,185.4,94,9.9,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,144,0,203.5,100,247.6,103,194.3,94,11.9,11,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,1,0 +1,74,0,174.1,96,251.1,94,257.6,123,8.3,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +1,139,0,134.4,106,211.3,98,193.6,125,10.2,2,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +1,103,36,87.2,92,169.3,110,166.7,80,10.9,5,6,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,1,0,0,1 
+0,53,0,205.1,86,160.5,95,149.5,142,10.7,2,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,104,0,156.2,93,193.0,54,222.7,94,13.1,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,96,31,183.4,126,195.5,106,180.1,93,10.5,5,1,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,118,0,187.4,97,177.8,89,233.4,97,12.2,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,101,28,220.3,96,285.8,72,203.0,111,9.4,6,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0,0,1 +0,96,0,160.2,117,267.5,67,228.5,68,9.3,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,42,0,150.7,52,246.7,96,103.8,118,7.0,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,110,0,55.3,102,164.7,124,200.7,108,10.2,5,1,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,68,0,172.7,95,139.1,90,174.3,99,11.7,1,2,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,112,0,170.4,103,200.2,71,258.3,100,11.6,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,115,24,142.1,124,183.4,129,164.8,114,9.6,4,1,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,157,29,219.2,102,206.0,109,192.4,117,15.0,5,1,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,69,27,268.8,78,246.6,89,271.9,102,16.4,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,45,29,135.8,104,222.5,101,235.6,92,7.9,6,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,117,0,239.9,84,174.8,106,209.5,93,9.8,2,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,155,0,190.3,123,301.3,96,214.6,134,8.0,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,160,0,176.2,90,196.0,115,263.9,95,9.2,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,122,37,163.0,107,312.8,118,200.0,85,11.6,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,129,0,101.4,145,249.1,116,157.6,107,7.1,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,145,0,175.8,89,274.3,119,226.6,69,12.4,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,23,0,190.2,89,166.4,108,219.8,73,15.0,4,6,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 
+0,116,35,182.8,122,212.7,119,193.8,103,11.0,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,66,0,146.4,107,196.5,99,230.1,106,7.8,2,1,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,151,0,170.2,89,187.5,83,119.5,100,4.3,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0,1,0 +0,99,0,128.3,78,215.3,120,143.7,140,14.3,9,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,1,0,1,0 +0,70,0,59.5,103,257.2,106,208.3,86,11.1,6,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,24,25,164.9,110,209.3,105,231.2,55,6.7,9,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,102,0,233.8,103,221.6,131,146.9,106,12.8,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,1,0 +0,73,0,187.3,118,239.7,90,167.5,108,15.1,2,1,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,35,0,149.3,113,242.2,122,174.3,104,8.9,6,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +1,119,16,147.2,103,160.1,96,184.0,120,7.7,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,1 +0,78,0,139.2,140,191.4,113,286.5,125,11.8,3,3,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,64,43,118.4,100,144.1,108,158.1,91,8.5,6,1,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,105,0,166.1,93,175.9,106,243.5,55,16.2,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,60,0,135.4,134,205.9,85,204.0,103,7.9,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,104,23,280.2,136,220.5,92,136.9,102,13.3,3,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,1,0,0,1 +0,40,0,115.7,105,127.8,113,107.5,91,9.3,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,132,0,81.1,86,245.2,72,237.0,115,10.3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,1,0,1,0 +0,9,39,214.1,108,169.2,115,189.7,117,10.1,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,129,34,204.5,79,132.8,113,190.1,117,14.8,9,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +1,150,0,136.6,112,209.4,81,161.1,78,12.2,2,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,181,27,190.3,93,249.0,127,215.7,82,10.6,4,1,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 
+0,125,0,233.3,65,209.8,93,210.6,109,9.1,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,99,0,54.8,92,173.0,103,195.1,125,7.5,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +1,112,0,272.5,119,226.1,94,159.1,94,16.4,5,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,88,0,192.0,91,127.6,127,155.6,125,7.5,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,132,0,99.5,110,129.1,80,125.1,124,9.7,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,122,22,204.5,92,139.6,121,205.0,103,8.6,5,2,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,84,0,146.8,133,171.7,73,234.5,69,9.9,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1,0,1,0 +0,139,0,271.6,130,156.0,131,136.3,108,11.6,9,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0,1,0 +0,201,0,212.7,72,225.2,90,195.1,99,7.0,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,73,0,234.7,102,195.7,110,253.4,71,8.4,8,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,71,0,243.7,124,60.0,90,189.0,129,11.3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0,1,0 +0,166,0,199.6,93,214.3,99,196.8,110,7.2,5,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,165,0,209.4,67,273.8,89,150.2,88,12.8,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,99,0,238.4,96,246.5,130,198.4,117,12.4,4,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,49,0,119.3,117,215.1,109,178.7,90,11.1,1,1,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,114,0,84.7,118,249.9,86,193.4,95,14.5,8,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,66,35,190.8,100,261.3,93,209.5,108,8.9,6,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,114,30,206.2,79,260.0,91,291.6,83,11.4,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,1,0,0,1 +0,65,0,192.0,89,139.5,88,187.4,102,5.5,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,120,0,203.3,108,259.9,66,115.9,103,7.8,2,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,80,0,203.7,92,216.4,97,154.2,66,7.6,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,55,20,207.7,91,199.7,113,216.5,110,7.3,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 
+0,49,0,213.8,79,265.1,93,239.8,128,15.6,7,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,33,0,164.0,99,153.1,102,123.8,104,6.4,4,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +0,86,0,148.2,71,285.1,91,166.4,155,6.2,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,109,0,209.1,141,205.0,93,119.4,111,7.8,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,1,0 +0,107,22,281.1,83,143.7,130,239.4,128,11.2,9,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +1,80,0,322.3,113,222.0,95,162.8,123,6.7,8,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,152,0,206.3,98,292.8,82,43.7,121,10.6,4,1,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,108,34,162.1,83,171.8,117,259.8,76,9.6,3,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +1,57,0,272.7,74,224.9,85,178.2,104,10.5,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,52,0,129.3,80,142.7,101,258.3,89,12.3,4,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +1,38,0,175.7,109,211.8,97,137.9,109,9.2,3,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,123,0,140.0,106,153.7,101,50.1,87,12.5,1,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,108,0,103.0,129,242.3,103,170.2,89,7.9,3,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,31,0,165.4,84,203.7,107,201.7,65,8.2,1,1,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,11,0,143.4,130,289.4,50,194.0,100,9.7,6,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,84,0,181.5,129,130.7,112,186.5,118,8.5,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,100,0,179.1,123,196.6,132,186.7,116,10.2,10,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,110,0,188.0,127,90.5,118,150.3,64,15.3,3,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,127,27,2.6,113,254.0,102,242.7,156,9.2,5,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,126,0,175.4,120,98.3,71,201.9,93,10.6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +1,74,0,125.8,103,207.7,96,207.4,143,14.1,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,1,0 +0,51,0,180.5,88,134.7,102,170.7,97,10.0,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,1,0,1,0 
+0,91,0,217.9,71,230.1,116,232.1,110,10.6,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,1,0,1,0 +0,41,0,202.9,97,153.8,104,113.5,92,9.0,3,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1,0,1,0 +1,108,0,291.6,99,221.1,93,229.2,110,14.0,9,2,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,96,0,97.6,98,105.5,118,220.2,105,11.6,9,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,125,23,120.5,104,227.8,115,158.5,100,10.2,3,1,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,63,0,142.5,92,208.3,102,228.9,120,7.5,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,0,1,0,1,0 +0,108,30,276.6,99,220.1,113,177.9,95,9.8,6,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,135,0,194.8,97,235.3,118,174.4,126,11.0,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,113,0,81.3,116,220.6,124,235.7,113,8.9,3,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,51,0,227.2,89,194.4,106,243.4,126,14.9,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,130,0,120.5,127,189.7,52,270.1,107,14.3,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,51,0,232.4,109,187.4,95,231.2,107,9.1,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,0,1,0,1,0 +0,133,0,127.3,108,251.3,81,135.0,88,10.3,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,1,0 +0,161,0,72.8,120,267.1,120,222.5,91,11.8,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,98,0,221.2,80,213.6,104,291.8,89,11.9,3,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,63,0,153.5,81,287.3,115,230.2,85,6.5,5,2,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,45,38,196.8,92,254.2,108,261.8,85,7.7,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,152,0,197.1,126,130.1,76,78.1,100,7.4,4,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,0,1,0,1,0 +0,84,0,130.2,105,278.0,60,305.4,74,14.0,6,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,1,0,1,0 +0,86,0,124.1,82,202.6,120,289.6,119,6.7,8,3,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,154,0,191.4,93,205.4,119,205.7,121,10.2,3,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 
+0,132,0,176.7,132,244.1,80,176.3,120,9.1,4,2,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,52,0,215.9,67,217.0,108,342.8,130,5.2,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,161,0,196.6,73,170.2,79,194.3,79,12.5,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +1,54,0,247.5,85,225.4,93,244.3,132,10.2,2,2,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,107,0,134.0,104,174.5,94,311.1,79,7.3,3,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1,0,1,0 +1,76,0,299.5,125,226.7,92,210.7,134,13.7,4,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,133,0,227.4,90,73.2,135,114.3,99,4.7,7,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,139,0,196.0,135,186.0,146,153.0,92,9.8,1,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,88,0,65.4,97,168.2,76,236.0,113,13.8,1,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,93,0,98.4,78,249.6,129,248.2,114,14.2,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,68,0,143.6,80,134.3,65,215.6,84,15.5,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,110,0,208.0,69,95.1,94,178.5,129,8.0,11,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,112,36,113.7,117,157.5,82,177.6,118,10.0,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,70,0,148.4,110,267.1,90,151.5,101,8.9,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,125,34,268.4,112,222.2,108,117.6,102,10.3,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,131,0,166.5,129,210.2,107,257.2,93,9.9,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,120,0,165.0,100,317.2,83,119.2,86,8.3,8,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,76,26,214.6,110,205.2,87,134.6,140,8.1,2,1,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,64,27,182.1,91,169.7,98,164.7,86,10.6,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,101,0,217.7,118,231.7,128,185.3,128,0.0,0,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,118,0,188.8,60,217.4,64,220.1,100,8.2,7,4,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,119,32,173.0,101,209.4,93,231.1,91,12.2,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1 
+0,124,0,178.4,72,233.6,134,179.4,91,12.0,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,0,1,0,1,0 +0,37,0,221.0,126,204.5,110,118.0,98,6.8,3,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,108,0,115.1,114,211.3,70,136.1,85,13.8,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,1,0 +0,148,0,17.6,121,161.7,125,203.1,82,10.6,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,90,0,214.9,97,117.8,117,133.7,78,11.8,2,2,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,141,0,217.1,110,241.5,111,253.5,103,12.0,6,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,1,0,1,0 +0,179,0,170.7,54,191.1,108,214.6,107,13.3,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,32,0,171.2,82,185.6,102,203.3,64,10.2,7,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1,0,1,0 +1,60,29,265.9,113,215.8,94,108.1,82,14.0,12,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,1 +0,130,0,132.4,81,200.3,110,202.5,103,6.0,1,2,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,100,0,115.9,87,111.3,56,170.2,77,7.1,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,70,0,129.9,102,208.7,133,231.4,93,14.3,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,91,27,204.6,96,136.0,93,210.5,82,6.6,2,3,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,192,0,185.0,88,224.9,98,212.4,105,11.4,3,2,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,59,0,153.6,92,205.5,88,114.5,89,12.5,10,1,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,134,50,208.8,130,132.9,104,136.7,107,11.1,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,75,0,203.3,70,228.9,97,222.2,118,14.3,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +1,16,0,229.6,78,205.7,108,166.2,91,10.8,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,1,0,1,0 +0,61,29,128.2,119,171.7,83,250.9,114,11.7,6,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,0,0,1,0,1 +0,165,0,150.5,75,193.1,93,311.6,93,10.3,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,119,0,94.2,108,264.1,100,203.7,79,7.3,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 
+0,93,0,239.8,70,251.8,99,168.6,112,10.9,10,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,64,0,162.6,83,152.3,109,57.5,122,14.2,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,103,0,185.0,117,223.3,94,222.8,91,12.6,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,102,0,174.5,73,213.7,114,164.7,116,10.3,5,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,119,32,142.6,77,208.2,126,171.0,102,12.0,2,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,52,20,133.3,63,184.1,123,272.9,107,13.5,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,89,0,89.7,80,179.8,81,145.7,120,9.5,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,89,0,326.3,112,165.1,110,162.9,97,7.5,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,1,0,1,0 +0,149,20,198.9,77,274.0,88,190.7,76,14.3,9,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +1,167,0,169.2,124,173.3,108,216.5,64,12.4,4,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,0,1,0,1,0 +0,83,0,231.3,100,210.4,84,217.4,106,12.4,2,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,129,0,267.4,78,204.2,85,111.7,146,5.9,4,1,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +0,107,25,248.6,91,119.3,115,194.3,83,12.0,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,1,0,0,1 +0,133,0,221.1,133,160.2,140,161.8,84,8.4,3,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,99,0,169.2,70,271.5,77,170.2,104,10.6,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,144,0,133.3,101,255.5,127,228.6,68,11.6,2,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,98,0,12.5,67,256.6,90,169.4,88,7.7,9,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +0,107,14,114.3,132,199.8,91,194.7,74,7.5,8,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,105,0,147.7,103,222.7,78,163.5,102,12.8,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1,0,1,0 +0,161,0,173.4,100,213.7,74,141.5,69,11.5,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,108,0,240.2,78,230.3,109,217.0,83,5.2,1,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 
+0,105,0,232.6,96,253.4,117,154.0,101,10.5,9,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,46,0,199.2,111,175.1,83,210.6,84,10.2,2,3,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,134,32,216.8,78,102.2,111,174.0,83,8.6,2,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,117,0,143.3,103,211.3,108,185.2,96,11.5,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,163,40,231.9,56,211.8,91,268.5,74,12.3,3,2,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,77,0,233.8,104,266.5,94,212.7,104,7.6,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,109,0,175.6,80,238.0,94,198.4,103,10.2,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,94,0,271.2,105,202.6,105,221.6,51,11.5,3,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0,1,0 +0,87,0,185.8,119,192.3,83,200.0,96,6.6,4,1,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,105,33,209.6,68,146.9,140,121.0,131,10.6,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,125,0,137.1,94,209.8,83,238.4,114,8.6,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +1,112,0,111.9,92,114.0,143,146.8,79,14.1,3,5,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,119,0,124.3,68,207.1,88,157.4,93,14.8,1,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +1,110,0,271.1,108,237.0,122,239.9,122,9.8,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,92,0,62.6,111,180.6,126,221.7,80,10.4,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,1,0 +0,144,35,174.8,127,219.6,93,255.8,90,12.8,3,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1 +0,73,0,194.8,112,167.2,85,100.3,61,10.8,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,54,39,117.6,82,159.2,60,236.4,113,11.3,10,2,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,77,29,211.1,89,223.5,97,148.4,106,9.7,9,2,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,120,28,215.8,123,285.2,76,192.1,78,6.9,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,0,1,0,0,1 +0,115,0,127.7,67,182.9,90,172.9,92,10.6,7,1,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 
+1,69,0,279.8,90,248.7,91,171.0,118,8.4,10,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,0,1,1,0 +0,73,31,194.4,104,176.0,84,230.1,110,11.5,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,4,0,145.3,89,303.8,93,206.1,82,8.9,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,1,1,0 +1,90,0,148.2,96,220.4,111,134.2,97,9.2,1,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,0,0,1,1,0 +0,91,0,203.1,106,210.1,113,195.6,129,12.0,3,3,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,120,0,184.5,103,209.0,86,169.7,70,10.2,6,2,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,181,0,229.9,130,144.4,93,262.4,110,14.2,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,92,28,151.1,90,194.8,79,239.2,114,10.0,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,122,34,146.4,104,89.7,103,220.0,91,15.6,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,76,0,169.5,77,124.0,87,219.4,92,10.0,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,139,31,203.5,82,200.3,72,214.0,112,13.4,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,1,0,1 +0,119,0,138.3,89,170.5,78,263.9,98,13.5,6,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +1,120,0,212.1,131,209.4,104,167.2,96,5.3,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,46,0,164.2,116,196.2,153,236.1,119,8.1,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,29,0,196.8,81,168.0,110,132.6,98,12.7,7,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,116,0,215.4,104,204.8,79,278.5,109,12.6,5,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +0,122,29,195.4,83,268.2,93,168.0,95,8.4,6,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,112,0,161.9,138,200.9,114,134.0,134,10.7,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1,1,0 +0,96,0,144.0,102,224.7,73,227.7,91,10.0,7,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1,0,1,0 +0,75,28,200.6,96,164.1,111,169.6,153,2.5,5,1,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,107,37,60.0,102,102.2,80,261.8,106,11.1,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 
+0,142,30,154.0,75,165.8,97,270.0,83,10.8,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,157,0,220.7,105,119.3,127,165.1,113,11.5,7,4,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,98,0,161.0,117,190.9,113,227.7,113,12.1,4,4,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,89,0,197.1,110,165.9,115,227.3,106,12.8,3,1,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,79,0,130.2,119,290.9,121,194.8,140,14.0,6,3,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,86,30,99.9,84,263.5,125,254.7,90,9.8,7,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,1,0,0,1 +0,95,0,160.0,133,215.3,98,188.9,87,9.1,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,149,28,180.7,92,187.8,64,265.5,53,12.6,3,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,72,21,186.7,108,335.0,86,187.2,119,16.5,4,1,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,16,0,161.9,100,230.1,138,148.8,78,10.2,11,3,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,164,0,192.1,95,249.8,94,132.6,100,7.3,3,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,93,0,141.7,95,221.0,100,227.1,71,10.2,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0,1,0 +0,155,0,184.6,102,196.0,117,226.5,122,7.8,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,83,0,195.0,92,210.5,83,180.6,92,11.0,13,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,59,0,204.3,65,247.3,123,214.7,94,12.0,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,107,0,189.7,76,156.1,65,244.0,91,8.3,3,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,1,0,1,0 +0,148,0,203.0,92,150.9,125,245.5,131,14.6,9,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,89,0,89.5,66,179.3,104,225.1,116,12.3,1,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,112,0,101.1,119,214.4,67,179.5,112,10.3,5,2,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,71,0,211.2,70,252.7,122,225.8,104,12.3,3,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,47,0,155.3,116,188.2,85,247.0,73,12.3,4,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 
+0,109,0,222.2,113,218.5,122,266.0,88,10.9,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,0,0,1,0,1,0 +0,170,37,178.1,130,242.8,103,243.0,93,13.0,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,163,0,197.0,109,202.6,128,206.4,80,9.1,10,1,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,140,0,149.7,71,212.5,97,245.9,67,12.6,4,3,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,112,0,168.6,102,298.0,117,194.7,110,9.8,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,127,23,182.0,80,216.1,85,156.9,82,9.8,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,121,0,213.2,79,120.7,116,244.4,102,7.5,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,96,40,108.6,90,206.4,154,126.3,118,13.4,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,137,0,104.7,115,249.8,144,192.3,99,8.9,2,1,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,67,0,109.1,134,142.3,76,91.2,86,10.9,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,1,0,1,0 +0,118,23,289.5,52,166.6,111,119.1,88,9.5,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,130,30,185.0,117,249.5,141,157.8,103,7.4,7,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +1,124,0,157.8,71,203.2,114,168.7,82,10.0,2,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,1,0 +0,33,0,190.6,100,161.7,104,189.9,136,13.0,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +0,101,0,174.9,105,262.0,75,210.0,93,8.5,5,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,118,42,148.7,105,167.3,105,270.6,105,10.4,7,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,89,0,125.6,108,213.0,90,181.7,108,5.4,5,1,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,1,0 +0,131,24,135.9,60,233.2,78,210.6,121,9.4,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,99,0,182.6,83,154.5,111,196.0,57,12.1,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,55,0,106.1,77,123.5,100,96.4,92,12.9,3,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,1,0 +0,141,28,308.0,123,247.8,128,152.9,103,7.4,3,1,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 
+0,95,0,194.6,114,232.8,106,173.4,92,3.8,2,3,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,140,0,235.5,81,257.2,130,103.1,111,11.5,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,94,0,181.5,98,199.9,88,287.7,114,6.6,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,95,32,247.0,109,125.6,91,226.5,90,10.5,4,3,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,89,32,209.9,113,249.8,104,224.2,92,8.7,7,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,1,0,0,1 +1,144,0,278.5,95,240.7,90,120.0,90,11.6,5,1,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,13,0,143.1,139,239.6,88,221.7,123,7.1,5,2,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,146,11,180.7,82,173.7,90,231.5,89,10.1,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0,0,1 +0,97,0,169.7,84,165.9,86,191.9,83,12.8,6,3,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,111,0,146.2,55,261.5,83,163.2,116,8.7,3,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0,1,0 +0,74,0,148.5,111,146.5,42,289.2,83,9.9,6,3,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,72,0,165.9,114,235.9,97,210.1,120,12.0,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,97,24,176.1,109,159.4,81,269.1,94,12.1,9,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,169,0,57.1,98,199.7,78,274.7,103,6.5,6,3,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +0,210,31,313.8,87,147.7,103,192.7,97,10.1,7,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,139,43,231.0,85,222.3,82,148.0,105,8.3,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,124,37,161.2,109,204.2,79,231.5,87,8.9,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,108,0,246.2,102,202.4,134,180.1,95,9.4,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,114,0,202.1,100,195.7,102,291.8,120,13.3,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,64,0,216.9,78,211.0,115,179.8,116,11.4,5,3,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,56,24,121.7,87,184.0,76,266.6,98,12.7,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 
+0,111,0,176.4,62,201.0,124,150.4,138,11.2,2,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,31,0,166.1,105,79.3,93,213.7,98,12.7,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,123,0,105.0,150,251.6,90,258.0,93,14.9,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,93,0,152.1,141,215.5,107,262.4,111,12.0,7,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,1,0,1,0 +1,151,0,218.0,57,114.4,88,269.2,95,12.4,1,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +0,128,0,148.5,105,243.0,106,255.2,114,6.8,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,90,26,169.0,104,188.8,104,213.3,76,13.3,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,0,1,0,1 +0,151,0,134.5,88,143.1,112,223.9,61,15.4,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,45,0,207.6,71,152.7,94,217.8,125,12.4,13,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,148,0,124.4,83,179.7,81,253.0,99,11.3,6,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,30,0,169.9,144,225.2,118,169.7,93,11.4,7,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,91,0,159.0,109,255.1,142,82.4,73,10.1,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,138,0,205.9,96,257.1,94,209.0,63,12.1,8,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,68,24,176.0,118,277.9,116,174.7,71,14.7,7,1,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,51,0,169.3,111,139.5,69,197.0,87,12.0,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1,0,1,0 +0,67,0,171.7,80,110.4,81,195.4,111,11.9,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,134,0,204.7,108,143.1,105,165.8,84,11.0,4,6,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,86,0,171.8,106,301.7,44,139.4,108,9.7,5,1,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,117,20,205.7,98,136.1,107,159.4,147,8.7,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,63,0,149.3,104,273.6,75,206.6,72,9.1,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,0,1,0,1,0 +1,178,0,124.5,134,141.2,78,268.2,113,11.4,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,1,0 
+0,163,0,178.7,56,215.7,79,152.7,84,10.6,2,4,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,92,0,139.8,98,174.9,143,201.6,135,9.4,7,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,116,27,204.7,118,209.4,91,212.9,67,7.0,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,189,38,256.7,98,150.5,120,123.0,87,11.4,3,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,176,0,250.9,108,171.4,100,148.6,85,9.6,6,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,190,22,166.5,93,183.0,92,121.0,102,8.5,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,98,19,110.5,87,227.8,97,243.6,84,11.0,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,127,0,239.8,107,128.9,121,249.9,110,11.3,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,138,0,241.8,93,170.5,83,295.3,104,11.8,7,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,117,0,158.7,84,181.7,91,177.3,67,7.7,10,2,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +1,91,0,246.4,110,182.0,98,157.6,106,12.1,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,1,0 +0,46,0,196.7,85,205.9,74,216.6,112,11.2,5,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,0,0,1,0,1,0 +0,85,27,196.4,139,280.9,90,89.3,75,13.8,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +1,53,0,261.2,119,250.8,105,176.0,112,9.8,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,155,0,250.8,146,152.5,105,148.1,104,10.0,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,1,0 +0,72,0,175.5,103,132.3,120,242.9,96,11.8,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,61,33,270.7,53,200.7,116,201.7,102,10.9,3,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,1,26,146.6,68,172.8,67,173.8,113,10.0,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,105,0,227.4,121,268.5,89,143.3,82,13.0,4,1,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +1,133,0,295.0,141,223.6,101,229.4,109,12.9,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +1,74,0,314.1,86,222.4,99,259.0,121,12.3,5,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 
+0,87,0,146.3,108,171.8,102,167.5,66,5.3,9,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,26,0,234.5,109,216.5,129,191.6,94,3.5,6,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,150,0,214.0,117,192.4,89,242.6,99,7.9,4,1,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,74,0,225.2,93,215.1,120,241.8,95,9.1,2,2,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,169,0,179.2,111,175.2,130,228.6,92,9.9,6,2,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,21,0,244.7,81,168.0,117,281.5,87,6.6,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,180,0,224.9,105,250.0,101,216.1,73,6.7,5,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,121,0,144.8,126,200.6,82,208.8,81,13.3,9,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,129,30,177.3,95,211.8,102,240.2,108,9.3,7,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +1,71,0,186.1,114,198.6,140,206.5,80,13.8,5,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,1,0 +0,126,34,244.9,118,219.6,105,210.8,136,9.7,6,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,0,1,0,1 +0,68,0,131.6,89,137.0,109,256.3,107,10.2,5,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,122,0,243.8,98,83.9,72,179.8,84,13.7,8,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,74,25,194.6,84,119.9,103,175.5,75,13.1,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,75,0,138.5,110,153.2,86,215.6,103,11.1,7,1,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,33,29,157.4,99,117.9,80,279.2,79,13.9,11,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,72,0,196.5,88,158.6,129,269.3,118,6.8,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,53,0,119.7,113,189.7,84,256.2,108,12.9,7,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,127,27,140.1,59,223.4,111,257.9,73,3.8,10,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,93,0,149.6,120,200.7,85,181.2,107,14.3,9,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,0,0,1,0,1,0 +0,158,0,209.9,112,221.3,82,210.0,93,8.2,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 
+1,115,0,245.0,97,250.7,75,270.2,124,13.7,8,1,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,137,0,174.0,123,161.3,115,260.7,98,11.4,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +1,82,0,167.1,77,131.8,79,187.4,98,9.4,1,6,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,46,34,191.4,102,361.8,96,147.5,132,7.2,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,88,0,189.8,111,197.3,101,234.5,111,14.9,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,35,27,241.7,87,142.0,101,288.9,68,9.4,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,1,0,0,1 +0,54,0,134.3,73,155.5,100,102.1,68,14.7,4,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,123,0,257.9,92,211.6,71,189.3,104,9.4,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,88,0,153.5,94,251.7,118,182.2,99,8.5,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +1,87,0,240.0,83,134.1,106,189.1,84,9.3,2,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,88,0,274.6,105,161.1,121,194.4,123,9.2,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +1,146,0,149.3,83,187.1,130,149.8,100,7.9,4,7,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +1,56,0,221.9,112,278.2,122,288.1,85,7.1,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,100,0,218.8,125,148.3,102,277.8,97,9.7,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,90,0,198.5,124,266.6,100,243.3,80,8.0,7,2,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,93,0,168.4,114,276.0,127,196.2,48,11.4,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,0,1,1,0 +0,43,0,27.0,117,160.9,97,279.5,96,10.7,3,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,124,28,251.4,104,225.1,89,251.9,121,7.5,5,1,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,90,0,246.4,83,160.3,88,170.9,99,7.6,7,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0,1,0 +0,64,0,146.7,83,148.3,91,238.6,69,12.5,3,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +1,128,0,227.9,130,302.6,71,191.5,82,5.5,7,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,0,0,1,0,1,0 +0,140,0,194.8,107,170.9,99,225.1,93,13.9,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,0,0,1,0,1,0 
+0,57,39,213.0,115,191.1,112,182.7,115,9.5,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1,0,0,1 +0,99,0,241.1,72,155.6,98,188.2,109,11.6,10,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,161,0,107.5,121,256.4,46,247.2,131,12.6,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,22,0,207.7,116,210.6,99,238.2,88,9.6,5,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,70,0,175.4,130,159.5,130,260.6,96,11.6,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0,1,0 +0,87,0,189.5,113,204.9,100,221.7,93,13.4,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,106,0,147.9,97,209.3,99,162.1,80,8.8,5,2,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,86,29,157.2,118,196.3,136,226.7,109,8.4,5,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +1,106,0,204.0,84,168.5,61,164.0,102,13.3,3,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,1,0 +0,57,0,85.9,92,193.9,127,231.5,93,10.1,2,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,136,0,252.4,74,167.9,81,248.3,110,10.7,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,149,0,242.5,83,245.4,97,219.6,80,10.0,3,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,117,31,104.9,115,237.6,125,263.4,104,7.7,6,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,120,0,180.0,80,224.2,82,265.4,91,4.7,7,3,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,81,0,149.4,68,171.9,98,214.5,97,17.9,3,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +0,87,0,167.3,119,198.5,119,133.1,88,11.0,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,1,0 +0,74,31,249.4,70,209.5,59,180.6,75,9.9,2,4,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,99,0,159.7,83,155.4,121,255.7,114,8.4,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +1,144,0,61.6,117,77.1,85,173.0,99,8.2,7,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,100,0,166.0,102,236.1,97,134.3,93,10.9,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0,1,0 +0,141,36,187.5,99,241.4,116,229.5,105,10.5,5,3,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,32,0,164.8,98,229.9,96,167.3,108,14.8,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 
+1,55,0,191.9,91,256.1,110,203.7,101,14.3,6,1,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +0,143,0,194.3,99,123.6,133,229.5,99,10.2,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,1,0,1,0 +0,74,0,136.7,106,228.6,105,265.3,114,9.8,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,105,40,236.5,111,117.0,110,221.1,115,8.1,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,89,0,206.9,134,167.7,105,155.7,86,10.9,4,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,1,0 +0,155,0,165.4,108,183.7,103,80.2,108,8.9,4,3,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,119,19,178.1,110,212.8,100,226.3,123,10.0,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,100,0,160.3,138,221.3,92,150.4,120,11.2,2,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,111,0,197.1,117,227.8,128,214.0,101,9.3,11,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,181,40,105.2,61,341.3,79,165.7,97,6.3,3,2,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,95,36,283.1,112,286.2,86,261.7,129,11.3,3,3,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,104,0,113.6,87,158.6,98,187.7,87,10.5,6,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,70,0,232.1,122,292.3,112,201.2,112,0.0,0,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,120,24,212.7,73,257.5,103,227.8,119,9.7,13,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +1,88,0,73.3,86,161.4,82,239.6,76,8.2,3,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +1,111,0,176.9,128,102.8,56,213.7,84,10.5,2,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,33,35,161.9,85,151.2,82,191.0,131,8.5,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,106,0,128.6,83,134.0,114,210.6,113,11.4,2,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,54,0,190.5,108,259.7,108,141.5,111,9.7,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +0,87,0,223.2,109,127.5,86,289.3,83,14.5,4,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +1,94,0,157.9,105,155.0,101,189.6,84,8.0,5,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 
+1,135,41,173.1,85,203.9,107,122.2,78,14.6,15,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,1 +0,107,0,273.5,104,183.8,68,153.8,67,11.0,9,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,1,0 +1,159,0,275.8,103,189.5,108,223.9,93,7.4,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,0,1,0,1,0 +0,106,0,119.2,142,228.4,139,197.9,61,8.4,9,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,136,24,174.6,76,176.6,114,214.4,91,8.8,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +1,116,0,133.3,94,247.8,126,219.0,78,11.3,5,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,115,33,145.0,72,194.5,157,242.3,138,14.2,3,2,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +1,103,0,150.6,125,169.1,126,221.2,104,10.4,8,8,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +1,95,37,220.2,109,185.3,99,205.1,82,4.1,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,1 +0,115,0,109.7,148,223.8,87,240.3,96,15.4,8,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,143,0,155.4,112,290.9,92,228.4,91,13.9,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,48,43,172.0,111,200.2,64,233.1,96,8.0,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,94,0,235.6,131,194.8,107,170.6,93,8.6,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,0,1,1,0 +0,153,31,218.5,130,134.2,103,118.9,105,9.4,6,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,94,28,92.7,107,127.8,86,225.6,86,9.9,4,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0,0,1 +0,107,0,90.7,90,207.5,109,169.4,96,5.6,5,2,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,91,37,162.3,107,233.9,115,277.4,94,9.2,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,141,0,146.5,121,169.9,125,238.8,112,8.2,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,58,0,210.1,126,248.9,108,158.6,88,14.4,2,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0,1,0 +0,49,28,214.4,78,235.2,100,206.2,107,8.0,13,3,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,41,34,194.4,63,254.9,110,160.2,115,17.2,9,2,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 
+1,137,0,237.3,103,176.7,84,263.4,81,14.2,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,1,0 +0,111,0,255.9,97,204.1,129,171.3,84,12.3,5,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,71,0,197.9,108,181.5,109,281.4,56,6.7,5,3,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,43,35,200.2,105,244.4,88,207.2,97,11.6,4,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,0,0,1,0,0,1 +1,97,0,120.8,96,169.8,101,194.1,63,11.9,3,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,3,36,118.1,117,221.5,125,103.9,89,11.9,6,2,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,124,0,131.8,82,284.3,119,305.5,101,11.3,2,1,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,86,29,225.4,79,187.1,112,281.1,112,12.9,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,87,0,205.2,106,99.5,122,189.5,75,13.4,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,83,30,272.5,105,253.0,83,180.8,123,8.7,6,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +1,67,35,181.1,59,215.9,116,216.3,106,16.9,4,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1 +0,46,0,122.2,67,167.2,62,194.8,98,9.7,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,129,33,119.6,104,278.7,88,263.4,175,5.9,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,90,0,109.6,88,137.6,108,159.7,121,11.0,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,97,0,112.7,119,217.7,109,152.1,76,6.5,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,0,1,0,1,0 +0,87,0,136.3,97,172.2,108,137.5,101,7.1,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,141,37,185.4,87,178.5,128,218.3,107,8.0,3,4,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +1,136,0,199.6,89,211.4,96,72.4,84,11.0,4,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,1,0 +0,88,0,218.2,76,169.3,60,141.1,99,8.0,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,170,0,259.9,68,245.0,122,134.4,121,8.4,3,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,0,1,1,0 +0,44,0,143.2,77,169.8,114,215.8,77,7.6,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,121,24,218.2,88,348.5,108,212.6,118,7.5,7,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 
+1,111,0,249.8,109,242.4,106,231.8,78,11.6,4,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,105,24,274.7,99,193.5,118,299.6,109,10.8,3,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,112,0,167.8,88,247.9,81,155.1,108,11.9,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1,0,1,0 +0,73,0,182.3,115,199.2,97,120.2,113,18.0,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,147,0,157.0,79,103.1,94,211.8,96,7.1,6,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +0,66,0,207.7,85,196.7,112,261.7,83,6.8,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +1,136,0,250.2,121,267.1,118,151.0,114,13.0,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,119,0,81.9,75,253.8,114,213.1,125,8.9,1,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,135,0,246.8,129,187.8,121,154.5,109,12.6,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,102,0,103.1,70,275.0,129,141.1,92,11.2,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,169,0,147.2,115,161.9,123,142.1,103,7.2,6,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,60,0,252.7,97,221.1,121,109.9,100,12.4,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,73,0,192.2,86,168.6,116,139.8,87,9.4,6,1,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,83,26,226.4,117,234.7,97,133.6,82,10.8,7,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,90,0,145.5,92,217.7,114,146.9,123,10.9,2,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,148,0,178.3,98,282.6,110,181.0,98,11.4,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,59,29,133.1,114,221.2,82,131.6,103,6.8,3,1,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,152,20,214.6,108,96.6,82,170.7,145,7.9,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,1,0,0,1 +0,136,33,203.9,106,187.6,99,101.7,107,10.5,6,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1 +0,112,0,185.4,114,191.4,119,144.0,78,10.0,11,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,122,0,140.0,101,196.4,77,120.1,133,9.7,4,4,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,1,0 
+0,44,0,240.3,146,164.6,83,240.7,106,10.6,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,122,23,134.2,85,227.3,132,122.4,96,8.5,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,89,0,141.1,92,249.1,126,136.0,73,10.8,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,176,0,47.4,125,167.8,90,163.1,107,10.5,8,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,64,22,200.4,80,131.1,84,230.7,67,7.6,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0,0,1 +0,112,0,167.6,100,154.5,90,281.4,107,17.3,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,133,32,221.1,137,264.9,99,168.9,108,15.4,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,1 +0,52,0,165.5,78,205.5,89,213.6,124,12.2,6,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,91,34,175.3,96,262.3,122,143.9,76,5.6,11,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,127,0,146.7,91,203.5,78,203.4,110,13.7,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,0,1,0,1,0 +0,153,22,167.7,104,246.8,91,203.9,117,7.5,11,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,117,0,184.5,97,351.6,80,215.8,90,8.7,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,163,0,202.9,100,178.6,46,203.8,116,12.8,3,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,1,0,1,0 +1,76,0,273.3,66,263.6,121,165.2,84,12.0,7,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +0,80,0,194.8,116,209.9,93,194.1,100,12.8,3,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,136,27,187.7,84,221.0,147,145.7,110,10.0,4,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,91,0,133.7,75,195.3,87,280.5,89,5.9,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,143,0,209.1,127,106.1,80,179.6,90,14.0,6,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,125,29,260.8,81,163.7,112,271.7,117,17.0,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1 +0,126,0,211.6,70,216.9,80,153.5,60,7.8,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,87,0,156.8,93,215.8,68,223.3,77,7.6,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 
+0,119,0,109.2,96,153.1,80,240.0,102,9.8,5,2,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,13,0,303.2,133,170.5,86,227.6,80,11.5,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,138,0,240.8,104,144.5,92,125.7,98,11.6,1,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,159,0,167.4,68,143.8,74,140.1,111,10.3,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0,1,0 +0,111,0,110.4,103,137.3,102,189.6,105,7.7,6,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,46,0,90.4,108,276.2,77,146.5,111,12.7,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,1,0,1,0 +1,68,0,162.1,86,155.0,86,189.7,87,11.0,9,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,107,0,212.1,95,150.1,88,219.8,111,7.7,2,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,70,0,214.8,87,131.0,114,216.9,104,9.4,3,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0,1,0 +0,215,0,83.6,148,120.9,91,226.6,110,10.7,9,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0,1,0 +0,22,23,182.1,94,164.6,59,128.8,102,12.7,4,3,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,122,0,170.5,94,173.7,109,248.6,75,11.3,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,73,28,198.2,107,139.1,123,199.1,139,8.8,1,2,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,75,0,143.2,92,209.1,142,173.0,96,11.9,9,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,87,0,184.5,81,172.0,103,183.4,96,13.7,3,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,148,0,185.2,87,170.4,96,165.1,104,9.5,13,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,105,0,156.5,102,140.2,134,227.4,111,12.2,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,182,0,69.1,114,230.3,109,256.7,96,6.5,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,139,0,139.0,110,132.9,93,272.0,120,12.1,1,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,105,0,101.4,48,159.1,119,259.2,53,12.2,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,166,0,274.3,110,52.9,109,246.1,119,10.9,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,0,1,0,1,0 
+0,60,0,220.6,57,211.1,115,249.0,129,6.8,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +1,76,0,107.3,140,238.2,133,271.8,116,10.0,3,4,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +1,28,0,121.7,48,125.8,112,261.6,122,8.3,2,6,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,94,0,163.5,136,143.7,111,253.4,82,12.6,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,146,19,176.6,88,162.7,66,215.5,98,14.6,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,101,16,118.9,112,228.3,97,180.1,111,8.2,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,132,0,240.1,115,180.4,91,133.4,122,8.0,6,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,0,1,0,1,0 +0,93,0,179.3,93,188.8,65,253.2,88,12.1,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,105,0,246.4,83,256.2,101,169.0,151,3.8,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,100,38,177.1,88,163.7,108,242.7,72,7.4,2,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,134,0,258.8,85,129.5,114,193.6,106,10.9,7,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,63,0,211.8,84,230.9,137,217.1,99,10.7,9,3,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,126,0,226.2,88,140.3,114,208.9,110,6.4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,0,1,0,1,0 +0,166,0,220.7,106,177.8,118,206.1,102,12.4,9,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,160,0,166.4,117,317.0,129,160.4,121,10.0,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,162,0,115.1,89,196.8,111,212.4,98,11.4,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,70,0,213.4,86,204.7,77,256.6,101,5.7,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,116,19,155.7,104,185.4,118,192.7,116,8.2,2,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,75,46,214.1,62,200.9,111,246.8,126,9.2,6,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,74,0,200.4,87,309.2,105,152.1,118,10.0,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,115,16,133.3,110,185.7,111,161.5,113,5.6,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 
+0,42,0,155.4,127,164.1,45,157.7,128,9.0,3,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,132,0,195.1,100,148.8,95,224.5,117,6.7,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,171,0,189.8,122,173.7,85,257.1,84,10.3,1,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,135,22,197.1,113,259.4,95,134.7,135,14.6,5,2,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,99,0,217.2,112,246.7,89,226.1,89,15.8,7,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,27,0,236.7,110,231.9,92,164.7,85,12.7,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,139,0,192.8,104,234.4,96,203.2,101,13.0,3,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0,1,0 +0,76,0,224.4,121,147.9,97,183.8,74,6.7,2,2,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,123,0,114.4,91,216.6,123,250.6,102,11.0,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,54,39,206.9,143,127.8,72,199.2,120,9.2,1,3,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,70,0,134.7,96,235.9,90,260.2,113,7.6,6,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,163,25,219.6,99,210.4,99,242.7,88,13.8,8,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,96,33,183.3,115,201.4,87,177.4,84,10.4,15,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,62,0,281.0,66,160.6,108,77.9,74,0.0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,1,0,1,0 +0,115,0,147.9,109,228.4,117,299.7,90,9.6,9,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,97,0,144.2,91,226.7,137,144.6,72,13.8,4,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,137,19,175.3,96,241.3,146,211.4,109,7.8,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,82,0,199.3,112,193.4,120,254.4,117,7.0,10,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,118,36,294.9,106,165.7,115,189.2,63,9.8,5,3,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,64,0,97.2,80,186.2,90,189.0,92,10.4,6,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,186,26,74.3,107,177.3,116,296.3,90,14.5,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 
+0,117,0,161.6,104,196.3,119,294.8,111,13.8,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,117,0,181.5,95,205.1,88,204.0,82,14.7,9,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,116,35,118.0,103,167.2,106,205.7,102,11.8,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,164,30,238.8,100,230.0,121,206.3,66,13.2,8,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,103,0,246.5,47,195.5,84,200.5,96,11.7,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,1,0,1,0 +0,137,50,186.5,94,178.0,106,215.6,100,12.1,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,1,0,0,1 +0,97,28,202.3,97,69.2,84,257.6,64,6.7,3,1,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,144,0,201.1,99,303.5,74,224.0,119,13.2,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0,1,0 +0,96,26,145.8,108,192.2,89,165.1,96,9.9,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,183,0,116.7,92,213.8,112,214.3,112,9.7,6,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,42,0,303.9,106,232.2,54,147.1,76,5.8,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,100,0,107.2,98,86.8,122,156.2,117,9.7,4,1,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,1,0 +0,131,0,211.8,115,260.5,102,144.2,96,10.8,7,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,88,0,113.7,67,165.1,127,141.5,142,10.8,3,1,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,91,0,149.0,115,245.3,105,260.0,94,8.3,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,104,0,118.5,92,177.8,109,255.7,98,12.1,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,63,49,214.9,86,198.2,89,170.8,139,8.2,5,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,159,15,113.9,102,145.3,146,195.2,137,11.8,9,1,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +1,147,0,225.2,111,184.9,98,143.2,146,9.9,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,0,1,1,0 +0,123,0,172.2,92,162.6,76,250.3,101,8.7,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,1,0,1,0 +0,100,27,221.7,100,236.1,70,192.7,91,8.0,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 
+0,105,0,150.0,106,293.8,123,250.7,65,10.3,7,3,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +0,163,23,160.0,104,189.4,64,229.9,118,10.4,7,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,90,0,142.4,126,126.2,118,274.2,71,4.6,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,125,0,182.3,64,139.8,121,171.6,96,11.6,7,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,64,0,219.6,126,303.3,100,154.5,65,9.7,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,113,0,92.6,85,177.6,92,159.8,72,14.4,4,3,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,101,21,238.0,88,209.6,84,233.0,95,10.0,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,123,0,224.0,99,210.7,80,231.9,75,2.1,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +1,212,0,226.0,127,304.6,83,181.2,132,12.6,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,73,0,224.4,90,159.5,88,192.8,74,13.0,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,44,0,204.6,117,205.2,94,164.6,84,10.7,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,0,0,1,0,1,0 +0,96,26,175.8,96,206.6,84,178.0,105,11.1,2,2,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,74,33,193.7,91,246.1,96,138.0,92,14.6,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,77,0,169.4,102,184.9,144,234.3,89,2.0,7,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,120,0,252.1,110,226.1,103,155.6,83,13.8,3,1,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,122,0,173.6,110,91.7,84,211.7,103,9.7,7,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,87,0,169.1,105,169.9,102,244.9,106,9.9,10,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,52,24,170.9,71,201.4,80,159.0,124,4.1,5,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,48,36,230.9,92,167.6,121,270.0,87,7.6,4,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +1,61,40,105.0,78,180.6,100,174.1,115,10.2,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,141,0,215.6,113,200.6,81,153.8,107,12.4,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 
+1,170,0,285.7,44,167.5,144,260.0,97,8.7,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,17,35,198.5,123,270.6,74,209.9,130,8.1,10,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,162,0,220.6,117,155.2,121,186.7,89,10.5,11,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,1,0,1,0 +0,85,0,96.7,97,193.8,95,171.7,88,9.7,3,2,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,160,0,97.5,113,268.1,69,255.3,62,13.2,4,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,29,37,235.0,101,183.3,79,139.8,106,5.7,7,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1 +0,91,0,109.8,100,189.6,104,206.7,85,11.1,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,96,0,197.7,68,250.5,53,181.2,67,10.5,3,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,104,0,264.0,108,132.2,75,177.7,91,10.6,8,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,95,27,129.5,106,248.9,90,268.0,115,11.9,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,84,0,159.0,80,167.9,128,167.6,101,12.3,5,1,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,157,0,276.2,95,165.8,119,151.6,79,2.2,4,3,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,165,0,207.7,109,164.8,94,54.5,91,7.9,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,57,30,234.5,130,195.2,116,268.8,94,11.4,4,2,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,1 +0,95,0,167.6,96,176.0,89,250.9,113,13.4,6,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,51,28,276.7,121,203.7,99,246.2,88,8.3,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,97,0,146.0,121,203.0,141,151.8,120,13.3,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,13,0,220.4,100,211.2,79,259.3,112,13.6,8,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,50,0,131.7,108,216.5,103,196.1,126,11.0,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,1,0,1,0 +1,46,0,250.3,100,260.6,90,195.0,104,13.3,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,121,35,68.7,95,209.2,69,197.4,42,11.4,4,1,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 
+0,68,0,207.6,68,251.6,123,191.6,100,10.9,6,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,72,0,118.2,106,167.2,136,214.2,106,12.2,3,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,82,0,208.8,101,213.7,87,175.1,86,12.4,6,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,1,1,0 +0,38,0,137.8,86,286.3,76,167.0,77,14.1,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,0,1,0,1,0 +0,41,0,209.9,105,121.9,105,253.7,104,9.6,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,96,0,179.5,125,162.3,139,264.5,133,6.6,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,129,0,216.0,85,186.9,114,210.7,109,4.9,10,2,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,31,28,210.5,101,250.5,86,241.6,125,11.5,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,122,0,234.1,101,200.2,121,237.4,89,13.1,9,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,51,0,181.5,108,196.9,87,187.2,119,10.3,2,1,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,109,0,222.5,74,169.7,75,264.3,94,9.0,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,161,38,240.4,112,201.8,102,206.1,112,16.1,6,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,72,0,109.1,97,115.7,96,295.8,84,8.3,6,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,65,29,158.1,104,322.2,81,210.0,96,8.9,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,129,31,193.0,99,224.8,87,197.6,91,10.3,8,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,137,0,205.9,88,209.3,86,289.9,84,14.5,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,48,34,198.0,70,273.7,121,217.9,71,7.6,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0,0,1 +0,134,0,244.1,99,246.9,111,200.0,133,7.2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,125,0,240.7,82,269.4,85,187.1,74,10.1,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,153,0,122.5,145,273.3,103,197.8,71,8.0,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,103,24,111.8,85,239.6,102,268.3,81,6.9,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 
+0,45,0,155.7,110,260.3,103,192.2,98,11.0,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,80,38,236.6,69,197.5,68,209.5,102,9.5,10,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0,0,1 +0,57,0,149.3,100,200.2,110,231.7,101,11.9,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,94,0,181.3,135,182.4,108,180.6,103,6.7,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,59,0,151.8,98,209.9,92,266.9,86,11.9,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,0,0,1,0,1,0 +0,72,0,139.9,117,223.6,96,240.8,93,12.7,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,0,1,0,1,0 +1,62,0,248.7,109,220.0,118,265.7,78,13.2,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,155,30,61.6,103,255.1,110,225.9,96,12.4,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,96,21,247.6,95,256.3,150,158.6,72,10.8,6,2,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,77,0,185.9,95,212.0,98,282.3,81,11.3,4,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,58,30,178.1,111,236.7,109,264.0,118,8.4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,134,32,80.3,94,199.9,124,170.8,117,16.6,3,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,24,0,235.6,132,115.9,129,185.4,136,16.2,2,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,158,0,172.4,114,256.6,69,235.3,104,0.0,0,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,89,0,178.7,81,233.7,74,131.9,120,9.1,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,138,17,225.2,116,173.4,88,145.8,99,11.7,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1 +1,61,27,187.5,124,146.6,103,225.7,129,6.4,6,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,123,0,204.4,88,137.5,111,226.0,100,10.0,4,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,87,0,134.2,80,165.0,71,173.1,102,10.7,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,1,0 +0,74,0,262.3,114,198.9,96,165.9,90,6.6,5,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,37,0,191.1,69,129.2,113,207.5,117,12.9,1,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 
+1,105,31,109.6,108,249.3,119,321.2,101,8.3,4,4,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,56,0,197.0,110,222.8,102,225.3,91,10.6,6,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,64,31,228.6,88,248.5,109,167.1,124,9.0,1,3,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,202,0,115.4,137,178.7,70,185.7,113,6.0,3,3,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,91,0,147.2,121,175.2,87,136.3,80,13.3,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,120,0,198.8,56,230.1,73,119.8,81,9.9,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +1,89,0,129.2,71,214.1,68,214.9,100,10.3,4,5,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,95,0,238.1,65,187.2,98,190.0,115,11.8,4,4,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,92,0,208.0,125,198.9,76,76.4,97,8.6,6,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,45,0,211.3,87,165.7,97,265.9,72,13.3,6,1,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,106,0,194.8,133,213.4,73,190.8,92,11.5,7,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,1,0,1,0 +1,125,0,143.2,80,88.1,94,233.2,135,8.8,7,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,129,0,143.7,114,297.8,98,212.6,86,11.4,8,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0,1,0 +0,159,0,198.8,107,195.5,91,213.3,120,16.5,7,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,99,33,179.1,93,238.3,102,165.7,96,10.6,1,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 diff --git a/sagemaker_model_monitor/introduction/test_data/upload-test-file.txt b/sagemaker_model_monitor/introduction/test_data/upload-test-file.txt new file mode 100644 index 0000000000..bdf08de0f3 --- /dev/null +++ b/sagemaker_model_monitor/introduction/test_data/upload-test-file.txt @@ -0,0 +1 @@ +test file \ No newline at end of file diff --git a/sagemaker_model_monitor/introduction/test_data/validation.csv b/sagemaker_model_monitor/introduction/test_data/validation.csv new file mode 100644 index 0000000000..0326df1a9e --- /dev/null +++ b/sagemaker_model_monitor/introduction/test_data/validation.csv @@ -0,0 +1,666 @@ +0,47,28,141.3,94,168.0,108,113.5,84,7.8,2,1,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,30,0,247.4,107,175.9,76,287.4,90,11.3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,106,32,165.9,126,216.5,93,173.1,86,14.1,8,4,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 
+0,131,0,240.9,108,167.4,91,322.2,109,14.7,8,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,83,37,78.5,109,210.5,101,179.7,102,11.8,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0,0,1 +0,142,22,224.4,114,146.0,106,241.4,98,8.8,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,81,0,183.6,116,152.6,98,212.2,99,12.2,6,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,0,1,0,1,0 +0,78,0,231.0,115,230.4,140,261.4,120,9.5,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,73,0,198.3,94,279.3,101,146.2,87,14.8,8,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,77,33,143.0,101,212.2,102,104.9,120,15.3,4,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,3,0,161.0,96,244.9,82,180.8,103,7.7,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,0,1,1,0 +0,123,0,260.9,85,168.5,103,178.3,91,13.3,5,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,72,0,217.8,93,189.7,113,182.6,91,10.4,5,4,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,130,0,207.1,70,200.1,115,194.2,100,12.4,2,1,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,87,0,168.2,92,224.7,100,169.5,99,12.9,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,55,25,165.6,123,136.1,95,175.7,90,11.0,2,3,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +1,167,0,131.6,120,211.3,96,168.3,97,11.1,4,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,178,0,275.4,150,187.5,62,147.1,126,13.6,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,102,0,271.1,80,172.0,133,169.2,105,10.3,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,72,0,166.5,102,261.0,103,262.7,85,13.3,5,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,129,36,192.8,103,177.0,83,216.5,118,16.4,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +1,56,0,162.3,99,149.1,78,255.5,115,14.8,1,4,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,136,0,199.2,122,214.7,114,150.9,105,11.8,7,1,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,1,0 +0,86,0,266.1,120,182.0,92,206.5,103,10.3,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 
+0,27,0,201.2,128,227.2,100,145.8,91,8.4,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,64,0,154.0,67,225.8,118,265.3,86,3.5,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,1,0,1,0 +0,109,0,154.8,82,287.7,109,208.4,80,5.9,9,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,103,0,160.8,91,155.8,82,254.3,103,8.5,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,111,21,127.1,94,228.3,116,166.7,108,7.1,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +1,105,28,156.1,89,107.1,114,167.7,95,14.7,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1 +0,156,27,192.3,137,199.9,115,244.2,112,14.8,8,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,108,0,73.8,105,143.4,114,170.2,98,10.9,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,119,29,117.8,66,256.8,114,147.6,76,7.6,3,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,102,29,214.7,86,314.3,109,280.2,110,14.3,2,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,142,0,232.5,74,181.8,142,203.1,86,10.4,6,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,0,1,0,1,0 +0,70,31,125.9,101,196.4,102,252.7,75,10.3,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,129,0,139.5,119,289.3,105,129.4,97,13.1,8,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,101,0,89.7,118,260.1,79,170.1,93,13.5,11,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,84,0,226.9,144,201.6,122,130.2,121,13.2,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,100,0,78.7,98,225.6,102,150.4,106,14.0,8,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,54,39,143.9,73,210.3,117,129.2,117,12.5,8,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1,0,0,1 +0,166,0,204.2,115,179.9,152,216.8,109,9.5,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,52,0,124.9,131,300.5,118,192.5,106,11.6,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,99,0,145.6,106,98.3,106,230.8,83,10.9,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,75,0,122.8,89,211.3,104,261.4,91,10.7,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 
+0,191,0,162.0,104,241.2,120,210.4,83,10.9,7,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1,0,1,0 +0,98,0,288.1,101,137.9,93,206.5,88,0.0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +1,171,0,270.5,69,230.0,112,136.0,111,9.6,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,0,1,0,1,0 +1,127,0,242.2,102,226.1,80,252.0,96,13.9,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,1,0 +0,98,29,111.1,105,217.9,101,248.1,108,6.6,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,52,0,191.9,108,269.8,96,236.8,87,7.8,5,3,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,99,0,191.2,110,163.9,102,243.6,114,14.1,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,64,19,180.1,106,127.5,92,237.4,118,7.5,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,86,0,70.7,125,211.0,113,174.6,107,0.0,0,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +1,116,0,159.4,79,179.5,88,167.8,71,9.7,2,6,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,48,27,141.1,109,224.7,94,174.3,122,13.2,2,1,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,66,0,116.4,98,95.6,74,181.5,94,10.5,3,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,51,12,135.8,60,200.6,134,192.4,98,12.3,7,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1 +0,114,4,141.3,96,230.4,88,223.7,85,9.4,3,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,78,0,155.0,106,175.3,101,155.6,125,11.8,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,50,0,183.6,107,58.6,118,202.6,99,8.7,3,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,48,0,128.2,71,48.1,78,116.3,80,8.9,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,63,33,184.2,111,312.6,89,264.0,55,12.2,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,35,0,260.8,87,258.1,78,131.3,123,5.8,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,73,0,241.7,115,168.5,133,169.8,122,11.1,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,95,23,160.3,87,202.4,101,191.1,122,7.4,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,43,0,84.2,134,80.8,103,196.1,79,10.8,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 
+0,85,0,212.3,107,228.4,103,163.3,116,7.7,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,162,46,224.9,97,188.2,84,254.6,61,12.1,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,179,0,116.1,101,201.8,99,181.9,103,11.6,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,97,32,183.4,94,269.1,120,203.5,38,6.7,4,5,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,88,28,190.6,104,237.3,105,211.6,116,9.8,1,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,39,0,154.1,104,204.2,112,196.2,92,9.8,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,122,28,166.0,62,233.9,88,170.1,84,7.7,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,1,0,0,1 +0,50,26,307.1,94,289.4,78,174.9,109,8.0,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,0,0,1,0,1 +0,119,27,220.1,128,268.2,133,146.5,80,11.1,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,87,0,204.8,101,161.0,80,285.7,89,9.5,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,105,0,273.8,97,289.7,106,269.1,126,5.8,3,2,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,152,20,237.5,120,253.4,94,265.2,80,14.2,3,9,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1 +0,134,0,164.9,115,126.5,96,238.5,125,10.0,9,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,100,33,218.7,104,155.0,144,99.0,117,12.1,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,68,0,178.7,61,252.3,84,255.7,76,8.4,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,128,0,125.2,99,205.4,107,254.4,111,18.9,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,120,39,200.3,68,220.4,97,253.8,116,10.5,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +1,35,0,138.1,115,158.2,82,215.7,118,10.3,2,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,112,23,286.6,79,315.3,102,193.9,101,10.3,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,33,0,213.9,88,239.8,119,148.7,71,9.8,14,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0,1,0 +0,106,9,88.5,100,324.8,109,79.9,86,8.2,4,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 
+0,131,0,175.1,73,171.9,116,131.1,94,7.3,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,0,1,0,1,0 +0,62,0,147.1,91,190.4,107,195.2,115,12.2,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,38,0,194.4,94,186.7,95,223.3,90,10.8,5,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,131,33,177.1,100,194.0,85,253.4,124,5.2,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,96,0,260.4,115,146.0,46,269.5,87,11.5,4,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,111,0,132.6,125,221.1,67,127.9,101,12.7,2,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,24,0,241.9,104,145.2,112,214.5,105,6.6,5,1,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,162,0,184.5,118,224.0,95,180.5,82,11.6,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,113,0,125.2,93,206.4,119,129.3,139,8.3,8,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,1,0 +1,113,0,128.7,100,227.1,67,178.1,135,9.2,4,2,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,194,0,48.4,101,281.1,138,218.5,87,18.2,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,157,0,152.7,105,257.5,80,198.1,93,9.4,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,0,1,0,1,0 +0,28,0,168.2,87,161.7,92,192.4,112,10.1,3,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,48,0,198.2,73,202.8,115,146.4,73,5.1,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,81,0,173.2,80,236.2,94,240.2,84,11.8,6,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,50,31,302.7,93,240.5,119,193.9,103,13.6,14,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,83,38,107.9,90,140.4,94,253.6,79,10.5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,122,0,145.6,102,284.7,111,228.2,91,12.2,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,128,0,147.7,94,283.3,83,188.3,124,6.9,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,98,30,110.3,71,182.4,108,183.8,88,11.0,8,2,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,135,0,239.9,91,177.1,104,217.2,118,5.9,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,78,0,87.7,74,214.8,58,201.3,147,10.8,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 
+0,103,0,263.4,118,179.1,69,214.7,112,10.3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,73,47,173.7,117,204.0,114,174.6,94,6.3,3,2,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,137,22,189.6,42,179.0,137,179.6,126,11.4,5,2,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,130,0,154.0,95,205.9,106,233.7,75,12.9,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,162,0,177.1,131,114.7,122,153.6,88,6.5,6,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1,0,1,0 +0,34,0,128.8,80,208.7,93,202.1,103,14.0,7,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1,0,1,0 +1,75,0,109.0,88,259.3,120,182.1,119,13.3,3,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,50,0,188.9,94,203.9,104,151.8,124,11.6,8,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,109,0,184.1,143,211.7,105,243.0,116,9.9,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,129,0,168.4,117,217.1,129,81.6,100,11.8,7,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,105,42,101.9,79,223.1,97,241.6,77,12.9,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,161,0,151.6,117,219.4,87,224.7,68,4.0,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,0,1,0,1,0 +0,112,0,189.4,83,219.0,89,168.0,116,7.1,8,3,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,31,31,100.1,54,246.3,97,255.0,131,5.9,3,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +1,133,0,277.3,138,228.4,117,117.3,103,12.8,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,79,0,220.9,107,192.2,97,161.0,74,12.2,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,121,20,211.9,110,215.1,120,238.5,107,9.4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,171,0,231.2,135,188.7,74,206.9,124,12.3,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0,1,0 +0,82,0,197.7,101,127.6,83,142.1,103,13.5,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,56,29,37.7,115,144.1,111,226.6,101,4.9,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,0,1,0,0,1 +0,116,12,221.0,108,151.0,118,179.0,80,9.0,6,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0,0,1 
+0,119,0,134.9,70,211.5,74,188.5,105,11.3,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +1,172,0,287.1,108,178.4,125,153.2,98,14.4,2,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,102,0,174.5,79,236.8,136,270.4,110,8.5,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,1,0,1,0 +0,100,25,246.6,94,141.4,112,189.8,109,13.0,5,1,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,144,0,139.6,96,124.2,93,95.6,75,15.0,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,49,0,236.6,91,220.9,146,146.8,114,8.9,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,132,36,201.9,93,156.3,75,131.3,92,13.7,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,148,0,218.7,111,155.6,133,277.4,62,8.2,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +0,92,0,91.7,90,193.7,123,175.0,86,9.2,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +1,82,0,266.9,83,229.7,74,251.7,99,11.0,6,3,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,54,0,116.8,119,123.1,123,217.5,101,12.0,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,1,0,1,0 +0,92,31,172.3,116,266.2,91,228.2,90,11.8,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,130,45,174.5,120,217.5,95,220.3,67,12.2,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,128,0,103.3,122,245.9,123,161.1,95,6.4,7,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,69,0,194.2,122,242.1,81,215.8,80,9.7,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,72,37,220.0,80,217.3,102,152.8,71,14.7,6,3,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,87,0,186.9,79,182.6,105,143.1,90,4.2,14,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +1,126,0,58.2,94,138.7,118,136.8,91,11.9,1,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,106,37,223.5,104,235.1,99,140.1,90,10.6,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,128,0,195.6,99,267.8,120,164.9,76,16.0,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,65,0,129.1,137,228.5,83,208.8,111,12.7,6,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 
+0,221,24,180.5,85,224.1,92,205.7,103,2.4,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,205,24,175.8,139,155.0,98,180.7,64,7.8,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,55,0,175.6,147,161.8,118,289.5,55,9.3,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,148,0,86.3,134,246.6,92,251.6,91,11.3,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,84,0,75.3,96,179.9,113,193.8,134,12.3,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,134,41,162.0,82,324.7,77,160.1,112,11.9,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,153,0,195.4,107,154.6,96,142.8,97,11.6,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,1,0,1,0 +0,16,0,205.6,69,169.5,93,220.1,64,10.9,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,1,0,1,0 +0,121,31,237.1,63,205.6,117,196.7,85,10.1,5,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +1,54,0,286.6,73,223.2,108,203.7,107,11.5,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,80,0,113.2,86,185.5,97,237.3,145,9.5,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,114,26,137.1,88,155.7,125,247.6,94,11.5,7,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,1,0,0,1 +0,75,0,190.5,91,178.4,75,162.4,113,13.1,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,138,0,169.3,82,217.9,147,184.2,77,9.4,9,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,1,0 +1,132,0,291.2,104,234.2,132,191.7,87,8.9,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,174,0,190.3,98,252.7,70,220.6,97,7.2,9,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,30,0,54.0,68,179.3,96,247.2,101,10.2,8,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,149,0,156.0,56,56.0,116,163.3,104,8.9,8,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,52,0,148.3,83,181.6,79,155.6,104,8.3,6,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,146,31,202.5,91,241.4,108,169.6,77,7.8,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,120,26,239.4,94,259.4,88,238.0,132,7.7,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1,0,0,1 +0,45,26,91.7,104,150.6,119,63.3,103,7.7,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 
+0,194,0,193.3,106,169.0,150,225.2,122,11.8,4,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +1,139,0,236.6,109,169.9,107,212.3,118,11.1,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,115,0,99.7,107,145.1,96,149.4,99,14.1,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,141,0,260.2,131,179.2,120,135.0,119,7.2,8,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,1,0,1,0 +0,62,0,98.9,103,135.4,122,236.6,82,12.2,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,119,0,294.2,100,232.5,53,195.0,64,9.0,1,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,87,40,221.6,79,157.1,74,222.4,124,11.5,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1 +1,118,0,267.8,145,316.4,121,208.6,91,14.4,11,5,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,41,22,213.8,102,141.8,86,142.2,123,7.2,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,38,0,117.3,114,208.7,105,203.4,98,14.4,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,19,0,259.4,116,269.7,109,175.3,130,9.5,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +1,108,32,209.5,108,109.6,64,189.7,145,9.1,6,6,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +1,58,0,112.2,95,209.6,108,260.9,78,13.9,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,29,0,157.4,122,145.0,75,281.8,92,9.3,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,0,1,0,1,0 +0,70,0,156.4,108,171.0,116,196.1,96,8.6,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,1,0,1,0 +0,94,0,190.6,108,152.3,95,144.7,97,7.5,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,61,31,288.7,101,203.8,102,203.2,49,8.6,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,82,24,155.2,131,244.5,106,122.4,68,10.7,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,18,0,197.0,97,203.7,107,202.0,105,8.7,3,3,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,92,0,264.3,91,160.9,115,198.6,73,9.3,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,0,1,1,0 +0,130,12,141.9,92,228.9,102,195.1,101,8.7,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 
+0,68,0,231.1,57,153.4,55,191.3,123,9.6,4,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0,1,0 +0,88,0,172.8,81,193.4,90,89.6,107,12.8,5,2,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,168,0,163.4,134,240.1,87,164.0,147,11.6,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,123,32,212.3,77,251.5,78,208.7,85,6.6,2,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,69,0,153.7,109,194.0,105,256.1,114,14.1,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,94,23,197.1,125,214.5,136,282.2,103,9.5,5,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,1,0,1 +1,117,0,118.4,126,249.3,97,227.0,56,13.6,3,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,136,29,85.2,98,230.4,85,243.6,104,9.0,3,2,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,1 +0,161,0,105.4,70,214.8,122,223.6,126,7.8,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,96,0,276.9,105,246.9,94,254.4,107,10.3,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,0,1,0,1,0 +0,101,0,156.4,116,130.4,114,207.3,109,7.3,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +0,77,0,221.8,84,166.0,125,210.2,72,13.2,4,1,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,110,0,131.9,93,272.7,106,192.8,105,7.1,4,1,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,73,0,94.1,136,280.3,122,205.0,77,9.8,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,119,35,217.1,92,220.8,134,249.5,93,8.0,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,37,0,106.6,76,147.4,89,235.8,113,9.6,8,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +0,143,0,133.4,107,223.9,117,180.4,85,10.2,13,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,1,0,1,0 +0,130,0,176.9,109,90.7,104,238.0,69,9.5,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,90,0,207.2,121,292.5,104,226.3,103,8.0,1,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0,1,0 +0,84,41,153.9,102,140.7,117,217.7,101,12.8,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,30,0,145.0,76,240.7,112,197.1,134,7.1,4,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 
+0,107,26,161.6,123,195.5,103,254.4,103,13.7,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,160,0,121.8,97,89.3,97,150.7,92,10.3,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,74,0,155.7,116,173.7,63,257.4,97,8.1,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,157,0,196.0,74,213.4,96,196.8,81,7.9,6,1,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,49,0,119.4,69,273.3,92,214.4,153,12.4,7,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,95,20,165.7,78,215.6,94,243.3,91,9.8,6,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,111,0,152.2,114,137.2,102,185.9,97,9.8,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,1,0,1,0 +0,138,0,171.4,117,115.2,128,224.5,115,17.0,4,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,84,30,106.5,65,225.7,108,188.6,61,5.7,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,101,25,144.1,144,167.6,105,240.0,107,14.5,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,121,0,215.6,74,192.9,98,144.0,103,10.1,4,5,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +1,76,0,263.4,148,230.3,69,170.6,101,11.4,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0,1,0 +0,129,0,150.0,98,232.4,101,261.2,123,12.5,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,68,0,157.3,83,220.9,85,218.9,129,12.0,7,1,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,127,0,247.5,99,108.5,118,232.0,72,10.6,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,1,0 +0,107,0,194.5,97,186.3,131,178.3,106,12.7,1,2,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +1,87,0,151.0,83,219.7,116,203.9,127,9.7,3,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0,1,0 +0,120,0,177.2,88,270.4,99,231.5,90,14.0,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,37,0,199.5,107,207.5,110,83.9,123,8.1,4,2,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,147,36,254.2,78,228.1,105,98.0,125,13.8,7,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,103,0,198.5,112,42.5,90,179.2,124,12.4,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 
+0,135,24,127.7,54,215.0,105,234.3,84,5.8,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1 +0,68,0,219.6,97,141.1,144,205.7,101,10.8,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,155,0,216.7,30,144.3,125,135.3,106,10.8,1,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,148,11,252.9,129,284.3,88,262.8,99,12.3,1,1,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,124,0,169.3,108,178.6,91,242.3,82,12.2,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,148,0,153.6,148,262.1,87,225.5,99,9.8,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +1,96,0,180.6,92,190.9,114,295.6,125,10.3,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,86,0,128.3,121,197.1,93,138.4,152,12.2,5,7,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,116,24,183.6,138,203.8,90,166.9,89,6.0,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,96,0,106.6,128,284.8,87,178.9,92,14.9,7,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,120,0,150.6,85,119.0,128,232.9,123,6.4,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,44,0,288.8,86,175.9,87,215.4,106,9.5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,149,0,217.7,91,273.5,74,226.9,99,9.6,3,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +1,93,38,225.7,117,119.6,122,193.2,125,14.0,7,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1 +0,134,0,177.2,91,228.7,105,194.3,113,8.9,3,2,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +1,143,0,223.3,99,167.1,128,203.0,84,4.5,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,23,0,321.6,107,251.6,115,141.1,158,11.3,3,2,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +1,138,28,211.2,117,312.5,98,178.0,118,10.7,2,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1 +0,177,0,189.5,99,176.3,117,225.9,112,14.2,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,137,0,115.0,130,137.8,83,224.0,61,7.3,4,3,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,201,0,225.9,110,299.1,86,251.3,81,11.2,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,1,0,1,0 
+1,145,0,224.2,89,314.9,121,182.9,121,16.1,3,1,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,129,0,192.9,131,185.5,101,205.2,130,10.9,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,1,0 +0,120,31,153.5,83,219.1,96,237.4,76,11.4,4,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,126,31,278.0,88,253.2,65,223.2,114,8.7,4,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +1,122,0,157.1,134,184.9,122,197.2,59,8.5,5,4,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,196,0,234.0,109,249.5,114,173.1,70,9.1,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,90,0,222.0,93,187.0,103,282.3,124,12.4,6,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,0,0,1,0,1,0 +0,192,0,221.6,101,285.2,50,167.4,83,12.7,6,4,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,121,26,170.4,91,254.5,90,219.6,122,15.1,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,165,39,167.4,113,172.7,94,192.6,113,9.5,4,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,185,29,151.1,121,244.7,88,154.4,91,13.8,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,90,0,140.2,97,213.9,102,120.0,126,7.1,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,131,0,110.9,74,115.6,90,190.5,114,15.8,9,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0,1,0 +0,76,0,179.2,85,222.9,66,188.2,113,12.4,2,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,132,0,214.6,78,251.7,98,240.8,88,13.9,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,105,0,102.8,74,281.7,125,228.1,113,13.2,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0,1,0 +1,117,0,54.2,100,303.2,84,171.8,84,8.6,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,1,0 +0,105,0,193.7,108,183.2,124,293.7,72,10.8,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,93,0,190.2,68,262.2,64,130.0,92,8.8,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,88,0,264.8,124,245.4,112,160.5,115,14.8,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,137,0,206.4,122,128.0,102,194.5,84,8.8,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 
+0,66,29,229.4,104,257.4,84,231.5,119,8.0,1,2,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,141,23,149.7,112,162.5,118,220.3,115,7.6,2,3,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,161,39,218.5,76,112.7,94,205.1,121,7.3,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,53,18,146.8,107,310.0,84,178.7,130,7.2,7,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,68,0,94.1,93,147.6,80,213.5,85,10.1,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,97,0,145.0,103,294.3,93,239.8,120,11.0,2,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,1,0 +0,122,45,147.8,85,147.4,93,203.5,110,14.0,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,104,0,263.4,101,235.5,117,102.0,146,13.0,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,32,0,157.9,88,180.8,132,132.5,102,12.8,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,30,0,137.6,108,162.0,80,187.7,126,5.8,10,3,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,114,0,191.5,88,175.2,78,220.3,118,0.0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,94,0,108.0,79,241.9,152,252.1,92,10.4,3,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,113,0,187.6,97,208.2,118,158.9,101,8.7,6,2,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,129,0,334.3,118,192.1,104,191.0,83,10.4,6,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,44,34,221.8,105,161.7,85,227.7,62,14.0,7,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,142,0,216.8,134,187.8,106,138.1,108,8.3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,79,31,103.1,90,243.0,135,76.4,92,12.2,8,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,131,30,174.0,118,205.3,81,218.2,90,6.7,3,1,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,75,0,224.7,116,192.0,79,212.2,98,11.3,11,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,1,0,1,0 +0,106,0,220.7,120,270.2,95,121.6,113,8.7,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,103,0,189.8,110,115.5,83,191.3,103,12.2,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,0,1,0,1,0 
+1,129,0,137.8,120,225.8,110,145.2,95,10.2,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,99,0,135.7,107,208.4,103,209.0,95,8.8,3,7,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,72,39,92.8,98,271.2,115,167.1,83,5.8,7,1,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +1,115,0,249.9,95,242.5,104,151.7,121,15.3,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,1,1,0 +0,94,0,194.1,62,227.2,54,190.4,115,15.3,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,124,0,160.9,109,144.2,152,120.4,97,12.9,12,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,68,0,158.8,119,211.8,105,198.1,101,10.3,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,0,1,0,1,0 +0,93,31,237.2,85,213.1,100,192.7,87,10.7,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +1,138,0,196.2,129,176.5,86,232.4,108,15.2,1,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,1,0 +0,74,32,174.6,107,310.6,115,234.7,92,9.0,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,116,0,192.1,98,312.9,135,130.2,94,7.9,2,3,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,117,0,97.1,98,228.0,131,240.0,111,10.6,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,0,0,1,0,1,0 +0,72,0,198.4,147,216.9,121,112.8,125,13.1,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,27,0,226.3,95,274.3,109,242.7,119,8.2,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,103,0,129.3,103,202.8,89,233.0,126,12.9,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,75,21,175.8,97,217.5,106,237.5,134,5.3,4,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,42,32,163.8,80,177.8,123,190.4,106,8.1,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,108,0,199.3,104,224.2,92,140.1,57,15.2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,63,34,152.2,119,227.1,91,195.7,103,12.3,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,1,0,0,1 +0,170,0,246.4,107,228.1,124,166.4,95,9.1,8,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,51,0,259.9,114,176.2,94,77.2,112,15.3,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 
+0,141,39,116.9,127,276.5,88,289.9,125,12.3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,96,18,172.7,86,133.4,113,259.5,70,9.8,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,1,0,0,1 +1,208,0,326.5,67,176.3,113,181.7,102,10.7,6,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,79,0,144.0,90,135.8,91,212.4,129,13.0,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,177,0,175.4,99,155.3,83,179.4,86,11.5,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,86,0,141.3,72,154.3,95,210.6,91,8.2,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +1,131,0,131.6,95,179.3,109,251.2,129,15.5,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,158,0,195.9,103,89.1,95,302.2,82,10.3,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,46,0,40.4,105,172.4,83,145.1,89,9.0,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +1,161,0,194.2,106,249.4,105,254.9,129,12.9,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,1,1,0 +0,121,0,179.4,70,143.0,93,116.3,113,11.2,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,101,0,231.3,87,224.7,88,214.6,69,7.2,7,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,119,26,161.3,97,250.3,110,142.4,92,6.6,8,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,0,1,0,0,1 +1,92,0,184.7,60,262.0,73,239.5,120,12.3,6,2,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,51,0,229.7,129,336.0,104,192.8,128,9.6,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,121,0,255.1,93,266.9,97,197.7,118,8.8,3,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,1,0,1,0 +0,76,0,165.7,94,257.4,80,170.8,114,10.0,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,59,0,189.7,100,115.9,133,220.6,115,7.4,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,1,0,1,0 +0,56,0,253.2,95,188.0,116,142.0,133,4.4,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +1,41,0,232.1,74,327.1,88,226.5,119,10.9,2,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,109,27,166.9,85,221.2,92,197.3,97,12.3,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +1,115,0,251.3,69,252.5,96,118.3,112,9.9,1,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,1,0 
+0,63,29,142.3,107,118.7,56,240.1,91,6.6,8,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,43,0,199.9,108,288.4,80,180.6,103,11.3,7,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0,1,0 +0,101,29,121.1,116,186.4,100,241.7,75,10.1,6,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,151,0,156.4,108,233.4,118,195.7,141,7.7,6,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,70,0,152.8,145,183.6,102,151.8,75,10.5,2,1,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,101,33,200.1,108,188.9,122,205.1,90,15.5,4,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,48,0,300.4,94,133.2,103,197.4,94,7.2,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +1,62,32,218.4,93,236.7,132,192.2,137,13.2,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1 +0,162,33,184.5,139,183.2,78,127.4,106,12.3,6,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,116,34,268.6,83,178.2,142,166.3,106,11.6,3,2,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,179,0,234.5,134,164.2,94,191.4,72,6.1,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,105,0,140.6,109,178.6,51,217.0,83,6.8,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,33,0,182.5,65,232.1,96,149.2,82,7.5,2,2,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,75,37,121.5,97,271.4,110,248.7,97,11.3,5,2,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,119,0,133.4,102,204.6,71,196.9,103,11.1,7,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,146,0,133.0,65,262.8,93,214.3,128,11.2,3,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,1,0 +1,162,26,179.7,144,218.1,129,212.3,105,9.3,8,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,163,0,122.4,129,113.4,108,180.2,97,12.5,7,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,85,0,165.4,107,196.0,126,349.2,110,9.6,7,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,0,1,1,0 +1,66,0,167.3,91,230.0,68,191.7,118,10.6,5,1,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,87,0,237.2,124,222.6,87,173.3,81,11.0,3,1,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 
+1,122,0,296.4,99,214.8,89,133.9,107,11.4,3,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,125,0,131.8,97,136.7,100,308.2,119,7.7,6,2,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,107,38,219.4,92,180.5,73,104.1,91,11.0,1,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,104,0,148.2,108,161.8,113,259.3,103,11.0,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +0,101,0,209.6,107,228.8,96,172.4,85,7.6,2,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +1,113,0,245.3,108,259.9,140,204.3,115,10.7,2,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,50,35,192.6,97,135.2,101,216.2,101,7.9,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,167,0,166.4,85,243.2,135,229.2,95,9.9,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,107,0,86.8,95,108.1,85,204.3,87,13.2,3,1,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +1,63,0,278.0,102,266.4,114,224.1,118,13.1,4,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,0,1,0,1,0 +0,123,0,211.0,92,217.0,102,214.8,104,9.8,7,3,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,101,24,114.1,95,161.5,86,176.3,90,13.0,9,2,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +1,37,0,172.9,119,183.0,86,226.4,100,9.8,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,1,1,0 +0,163,0,223.0,120,227.0,98,188.3,125,8.8,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,130,0,68.4,86,193.3,110,171.5,139,10.4,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,135,0,218.8,123,242.8,64,85.8,80,10.3,3,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,87,0,231.3,105,171.7,108,67.7,136,13.0,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,125,36,201.3,117,42.2,78,125.7,104,5.4,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,104,0,183.6,133,120.7,98,215.1,112,12.7,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1,0,1,0 +0,141,0,242.8,90,234.1,80,211.5,104,6.0,3,5,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,138,0,133.9,87,166.4,110,193.5,139,15.4,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 
+0,98,21,161.2,114,252.2,83,160.2,92,4.4,8,4,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,149,0,187.6,83,201.4,81,264.2,79,8.8,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,137,0,208.8,120,225.3,100,221.6,130,11.1,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,0,1,0,1,0 +0,127,0,102.8,128,143.7,95,191.4,97,10.0,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,120,33,299.5,83,163.4,84,146.7,88,11.6,5,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,64,0,113.8,97,192.3,97,214.9,89,10.4,1,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,124,0,184.8,74,175.1,84,158.2,95,10.5,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,1,0,1,0 +0,34,0,193.7,74,126.9,84,221.2,166,8.8,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,109,0,268.4,85,150.6,131,297.9,84,9.7,8,2,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +1,127,0,266.6,106,264.8,168,207.2,119,5.9,2,1,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,153,0,193.8,90,195.3,121,182.7,108,8.5,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,131,0,94.4,80,215.1,101,179.7,108,13.1,9,2,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +1,93,0,216.9,61,207.4,120,221.7,110,17.5,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +0,57,0,192.8,68,158.0,86,235.5,105,12.7,6,1,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,55,0,286.7,100,134.4,121,192.9,122,6.9,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +1,94,0,269.2,104,193.8,144,257.6,61,8.9,2,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,75,0,166.7,113,148.3,122,186.9,121,10.1,3,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +0,80,0,148.6,106,210.8,65,203.7,86,10.0,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,60,31,191.8,75,267.8,135,200.5,62,12.8,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,115,0,180.0,119,198.8,126,217.1,70,12.4,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +0,160,0,85.8,77,165.3,110,178.5,92,9.2,4,3,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,90,42,193.3,66,263.3,85,214.4,97,11.1,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 
+0,50,0,258.1,106,161.4,106,225.1,110,11.7,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,142,0,140.8,140,228.6,119,152.9,88,10.9,7,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +0,137,0,155.5,81,133.1,94,253.1,77,9.1,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,1,0,1,0 +0,189,0,208.3,106,236.7,123,179.1,120,11.3,5,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,73,0,183.4,80,242.0,115,201.4,100,7.5,3,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,60,0,179.3,147,208.9,89,248.2,98,13.5,6,1,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,165,0,154.2,91,268.6,108,188.8,99,10.9,4,6,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,1,0,1,0 +0,101,0,248.6,102,174.9,93,207.2,86,6.1,3,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,1,0,1,0 +0,168,0,128.8,96,104.9,71,141.1,128,11.2,2,1,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,134,0,178.0,110,153.8,64,236.6,105,11.7,4,1,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,54,0,214.1,77,240.5,94,188.9,75,10.1,3,1,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +1,53,0,228.6,117,132.8,123,227.2,124,10.1,2,9,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +1,23,0,113.1,74,168.8,95,262.9,126,6.9,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,96,0,150.0,122,218.5,116,212.4,89,9.8,1,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,82,0,125.7,96,207.6,137,183.1,103,12.9,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,65,0,116.8,87,178.9,93,182.4,150,14.1,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,84,0,299.4,71,61.9,88,196.9,89,6.6,7,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,1,0 +0,31,0,97.5,129,260.4,78,88.7,100,7.0,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,147,35,197.3,134,141.1,99,212.1,90,10.1,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,101,17,193.9,71,189.8,81,196.3,97,12.6,7,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,122,0,215.6,86,167.8,59,207.0,67,6.4,8,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,77,0,251.8,72,205.7,126,275.2,109,9.8,7,2,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 
+1,97,0,225.1,90,279.5,127,233.8,103,8.8,4,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +1,116,29,162.3,91,279.3,79,192.7,131,11.7,2,3,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1 +0,50,0,301.7,82,167.1,118,72.2,89,10.5,6,1,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,99,0,95.4,105,207.2,101,136.0,117,5.6,5,1,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,116,0,164.6,110,270.6,103,230.4,109,8.0,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,35,0,105.6,129,258.2,129,213.1,77,8.7,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,185,31,189.8,126,163.3,133,264.8,126,7.5,3,1,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +1,108,0,154.2,123,112.3,86,246.4,75,15.4,4,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,78,0,130.8,64,223.7,116,227.8,108,10.0,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,123,0,242.2,87,226.1,101,268.6,121,8.2,3,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0,1,0 +0,176,0,223.2,76,214.4,131,154.4,80,10.1,2,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,89,24,97.8,98,207.2,67,214.5,126,5.9,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,75,0,111.7,121,237.3,119,253.5,110,13.1,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0,1,0 +0,55,0,221.0,115,165.4,97,235.4,117,9.7,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,1,0 +0,138,29,190.1,87,223.2,123,256.2,130,14.2,6,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,108,0,215.6,78,195.3,119,194.4,65,3.6,5,1,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,193,31,71.2,58,124.7,105,155.5,108,11.7,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,122,0,141.4,128,146.4,70,123.0,75,8.1,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,159,0,257.1,53,312.2,127,183.0,82,8.8,6,1,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,1,0 +0,11,28,190.6,86,220.1,122,180.3,80,6.0,3,3,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,52,0,204.4,97,273.2,128,179.6,118,11.0,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 
+0,106,0,159.6,94,276.8,118,223.5,65,8.8,3,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +1,146,0,138.4,104,158.9,122,47.4,73,3.9,9,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,78,0,137.4,109,237.6,49,206.7,136,14.0,11,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,41,30,191.7,109,193.0,86,149.4,93,11.1,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,124,0,253.5,104,117.9,123,248.5,104,14.0,2,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,117,0,214.4,94,138.0,149,148.7,102,9.9,1,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,58,22,224.1,127,238.8,85,174.2,86,11.5,7,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,130,0,124.3,70,270.7,99,239.5,83,3.5,6,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,1,0,1,0 +0,92,33,243.1,92,213.8,92,228.7,104,12.1,2,2,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,161,0,178.1,109,146.5,86,137.6,78,8.5,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,139,20,214.6,101,235.1,132,162.8,132,14.8,12,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,86,0,126.3,115,168.8,112,154.6,95,9.8,7,2,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,58,0,149.4,145,196.5,105,209.5,108,14.9,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,99,0,256.4,44,214.5,105,233.7,75,7.9,1,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,77,0,230.0,87,103.2,138,309.6,136,11.3,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,85,37,229.6,123,132.3,90,211.9,76,9.5,8,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,96,0,173.9,111,287.4,105,204.8,91,9.1,7,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,83,0,202.3,87,201.5,111,101.7,82,6.8,4,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,53,24,145.7,146,220.5,136,249.9,96,13.1,5,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,57,0,189.3,157,174.9,70,221.9,117,11.2,5,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,87,0,58.0,125,67.5,116,185.9,136,11.5,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0,1,0 
+0,127,25,203.8,118,267.1,48,225.1,105,7.3,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,66,0,256.3,135,180.2,106,187.3,135,6.2,7,2,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,111,0,99.3,112,270.5,136,225.3,94,9.0,6,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,0,0,1,0,1,0 +0,111,0,222.2,96,162.5,111,184.9,120,11.9,7,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0,1,0 +0,80,0,239.9,121,142.3,51,364.3,106,9.3,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,1,0,1,0 +1,109,29,111.2,90,263.5,98,224.7,128,9.0,6,6,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,85,0,127.9,107,271.2,124,202.2,76,12.5,5,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,91,20,146.1,98,277.4,104,137.7,100,6.2,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +1,101,0,269.7,85,207.6,86,214.2,107,4.5,15,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,225,0,165.4,106,273.7,109,210.0,93,8.7,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,111,0,224.9,117,191.9,127,229.9,97,10.3,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +1,81,0,324.7,48,236.4,82,187.6,78,13.1,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,172,0,211.7,100,198.7,101,301.7,136,6.5,9,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0,1,0 +0,157,0,185.1,92,213.0,85,196.1,85,8.5,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,93,36,178.7,134,178.6,102,126.8,82,8.0,4,2,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,98,0,72.8,107,186.4,103,175.3,110,10.5,4,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,64,0,224.8,111,190.0,101,221.4,110,9.2,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,190,0,169.4,102,253.5,113,197.1,93,8.9,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,169,0,142.5,82,231.4,110,131.2,67,10.0,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,101,0,190.7,72,208.6,103,203.8,111,8.8,8,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,33,0,159.5,115,195.4,118,102.4,86,7.1,7,1,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 
+0,172,0,203.9,109,234.0,123,160.7,65,17.8,4,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,123,0,150.0,98,89.8,95,326.0,91,11.1,3,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,103,28,121.0,105,270.4,100,160.5,76,7.7,4,2,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +1,64,0,346.8,55,249.5,79,275.4,102,13.3,9,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +0,106,0,52.2,106,220.1,113,112.3,95,11.4,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,43,0,159.5,99,119.7,149,173.9,126,6.8,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +1,109,0,264.7,69,305.0,120,197.4,86,9.5,9,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,147,33,251.5,107,234.1,110,213.4,87,10.4,6,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,63,0,83.0,64,177.0,106,245.7,89,13.0,3,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,160,0,174.3,105,171.3,107,220.8,131,8.3,2,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,97,0,209.2,134,0.0,0,175.4,94,11.8,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,197,0,233.9,96,218.9,111,182.9,109,9.5,3,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,77,0,175.5,86,205.1,78,245.2,100,17.8,3,4,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,189,0,227.8,124,169.5,112,201.1,91,5.6,4,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,124,0,244.6,89,188.8,80,206.0,114,11.3,4,1,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,1,0 +0,72,21,138.1,113,260.1,83,135.4,118,8.2,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,0,1,0,0,1 +0,150,29,209.9,77,158.0,52,141.9,113,6.6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,69,0,167.5,76,242.1,92,101.2,103,11.4,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,69,0,183.4,85,237.6,100,228.0,94,9.0,5,3,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,131,26,292.9,101,199.7,97,255.3,127,13.8,7,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1 +0,88,0,215.6,115,216.2,85,171.3,65,11.8,1,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,100,0,203.8,122,283.1,76,197.3,83,12.5,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 
+0,182,0,176.1,90,174.9,106,234.7,134,9.7,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,1,0,1,0 +0,36,0,175.1,144,216.9,69,243.7,146,9.9,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1,0,1,0 +1,134,0,296.0,93,226.4,117,246.8,98,12.3,10,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0,1,0 +0,112,0,193.3,96,264.1,123,128.6,115,9.1,3,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,27,0,193.8,102,118.9,104,135.9,124,9.2,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,99,0,146.7,64,274.0,99,321.3,98,8.9,1,3,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,134,34,247.2,105,225.5,133,186.3,76,6.1,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +1,133,0,117.8,100,199.2,105,244.1,119,11.8,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,164,27,159.7,102,168.8,113,244.1,127,9.6,9,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,87,28,151.4,95,152.4,97,250.1,109,0.0,0,1,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,131,0,112.8,133,199.4,116,142.7,105,10.1,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,130,0,216.2,106,363.7,86,126.7,123,16.9,2,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,1,0 +0,160,29,164.6,121,262.8,108,123.8,131,15.2,4,1,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,54,0,159.8,99,264.0,64,115.7,70,9.7,7,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,83,0,227.9,78,207.5,115,211.7,100,12.1,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,41,0,143.6,117,152.4,108,194.4,110,8.6,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,64,0,124.1,117,192.8,108,162.9,84,6.4,5,3,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,89,0,111.2,101,122.1,94,180.8,85,12.6,2,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,162,0,217.6,87,279.0,71,250.7,65,10.4,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,108,35,169.8,136,173.7,101,214.6,105,9.5,7,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,106,0,213.9,95,151.9,70,260.1,124,12.2,5,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 
+0,86,0,150.8,85,295.9,88,247.2,104,12.5,1,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,152,0,101.2,122,141.6,87,198.5,124,7.5,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,56,0,91.1,90,179.3,115,300.7,89,11.9,8,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,81,0,250.6,85,187.9,50,120.3,131,7.8,5,1,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,1,0 +0,136,31,143.1,88,236.6,65,227.8,120,11.4,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,107,0,206.9,79,262.4,117,149.3,69,10.7,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,190,0,150.9,86,268.6,129,179.9,73,14.7,1,1,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,105,0,206.2,84,256.4,138,117.1,91,9.0,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,104,0,76.4,116,115.6,74,226.3,94,9.4,3,3,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,90,0,114.4,122,127.7,154,253.1,109,10.1,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,125,0,126.7,113,155.5,131,206.2,112,14.4,7,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,0,1,0,1,0 +0,81,37,237.1,76,264.2,125,271.3,120,7.9,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,1 +0,85,0,197.2,97,211.7,115,210.1,133,8.3,4,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +0,13,31,265.3,94,147.6,95,259.3,117,12.9,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,78,0,109.5,105,286.1,90,247.6,113,4.9,9,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,83,0,221.4,103,231.8,103,122.5,100,9.8,5,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,0,1,0,1,0 +0,81,24,130.1,117,196.0,61,139.3,123,11.4,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,125,0,106.1,95,157.6,113,192.5,69,8.1,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,64,0,174.5,98,180.2,103,179.0,89,10.7,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,123,0,209.4,49,237.4,117,239.2,98,9.8,11,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,140,0,149.8,134,164.4,98,294.7,124,8.1,2,1,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 
+0,87,0,256.2,105,160.7,102,249.4,80,7.4,2,4,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,136,0,92.4,109,219.0,115,212.6,80,12.9,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,87,0,153.3,106,224.5,117,273.4,152,8.9,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,149,0,69.1,117,136.3,100,181.7,53,6.3,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,58,20,194.5,110,213.7,89,236.6,92,9.5,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,77,17,204.9,84,201.0,102,219.7,97,11.3,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,63,21,151.5,99,147.8,89,210.4,114,10.0,4,1,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,1 +0,30,0,162.3,96,244.0,122,180.1,89,9.1,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,1,0,1,0 +0,167,0,207.6,88,132.4,63,255.2,98,14.1,5,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,48,0,240.0,88,141.0,117,128.9,137,7.1,9,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,115,0,268.0,115,153.6,106,232.3,65,17.0,1,3,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,123,0,159.1,94,241.6,119,202.4,120,6.5,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +1,77,28,135.9,117,244.5,102,207.5,74,11.5,3,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,98,38,213.7,61,253.0,104,207.7,73,10.7,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,111,0,181.8,117,158.1,91,266.2,123,9.7,9,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,86,0,83.5,96,221.1,63,349.7,75,12.6,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,79,0,41.9,124,211.0,95,237.9,55,11.4,5,1,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,131,23,170.8,145,236.7,93,294.5,100,12.7,1,1,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +1,92,0,130.7,113,260.6,122,244.2,98,9.4,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +0,103,0,180.2,134,97.7,85,181.7,134,8.4,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,1,0,1,0 +0,105,0,259.3,96,175.2,97,222.4,36,12.0,5,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,1,0,1,0 +0,142,40,237.4,105,175.9,93,210.3,110,9.2,3,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 
+0,55,8,222.5,104,171.5,94,377.5,114,9.7,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +1,24,0,149.0,73,131.0,81,238.6,69,8.6,3,2,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +1,93,0,267.9,114,223.0,74,262.7,90,11.3,3,3,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,126,24,58.9,125,305.5,90,158.9,73,12.1,6,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +1,82,0,329.8,73,208.3,120,267.1,102,10.6,6,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +1,125,0,113.0,108,169.2,107,156.6,61,9.2,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,57,0,169.6,96,234.7,112,285.4,83,11.2,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,0,1,0,1,0 +0,68,34,160.0,72,184.5,119,208.3,101,6.1,10,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,104,0,138.7,100,215.4,58,164.3,98,4.9,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0,1,0 +0,1,0,123.8,113,236.2,77,73.2,81,3.7,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,151,0,178.7,116,292.1,138,265.9,101,9.8,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,38,0,131.2,98,162.9,97,159.0,106,8.2,6,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,130,0,115.6,129,167.8,104,141.8,124,12.6,9,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,107,0,230.4,65,257.4,80,107.3,88,8.5,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,0,1,0,1,0 +0,24,29,236.3,105,190.8,114,129.0,105,7.2,2,3,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,37,20,264.7,81,216.5,110,210.7,102,10.4,7,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,34,0,180.6,65,280.4,99,292.4,105,5.0,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,38,0,149.0,92,49.2,78,163.3,93,13.9,11,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,103,0,141.3,123,253.6,87,215.8,96,6.4,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,49,0,237.8,92,208.9,119,167.8,86,15.6,6,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0 +0,67,0,138.9,65,208.9,109,232.4,82,9.2,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,1,0,1,0 +0,74,0,165.3,120,198.5,106,208.5,102,9.8,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,0,0,1,0,1,0 
+0,79,0,134.7,98,189.7,68,221.4,128,11.8,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,34,0,293.7,89,272.5,71,178.2,76,11.0,10,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,89,0,192.1,83,163.6,88,169.7,138,6.1,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +1,50,0,99.6,108,308.7,102,161.2,62,13.7,6,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,0,1,1,0 +0,37,0,191.4,116,167.4,99,216.5,112,14.0,5,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,158,0,155.9,123,224.2,112,221.0,116,8.6,8,2,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,164,25,219.1,88,151.5,99,50.1,60,14.3,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,43,0,135.8,125,163.2,88,229.8,106,12.6,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,1,0,0,1,0,1,0 +0,104,0,182.9,113,239.6,85,229.8,104,5.5,4,2,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,104,0,144.5,107,180.5,85,226.0,94,17.0,6,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,72,0,177.1,97,184.7,105,174.1,94,8.0,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,83,0,117.9,101,160.4,92,235.3,150,11.4,10,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,54,0,236.3,91,152.8,130,160.3,98,11.2,8,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,1,0 +0,122,0,107.9,88,235.8,109,228.6,119,9.5,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,56,0,137.2,111,165.9,119,182.3,72,14.3,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,62,0,172.8,101,204.8,97,240.8,90,9.1,8,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,0,0,1,0,1,0 +0,134,0,183.8,111,123.5,92,160.7,105,6.1,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,200,0,128.2,87,133.2,105,177.6,123,11.2,2,1,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,104,26,189.1,112,178.2,97,199.3,104,11.1,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,102,0,242.2,88,233.2,89,188.5,121,6.2,6,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,33,35,186.8,124,261.0,69,317.8,103,15.0,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 
+0,120,0,137.3,100,212.2,129,152.7,92,10.5,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,98,22,278.3,89,93.4,143,107.6,42,9.7,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,21,0,91.9,109,198.4,111,171.7,125,13.0,7,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0,1,0 +0,104,0,167.6,116,219.2,112,215.9,94,11.7,2,4,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,131,0,197.0,79,201.0,114,151.2,111,11.6,5,1,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,103,0,160.2,104,138.9,70,312.5,97,9.7,2,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,90,0,157.9,72,234.0,93,210.0,86,12.2,5,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,166,0,181.4,108,253.8,54,112.3,94,11.6,6,1,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +1,115,0,226.4,101,276.8,60,213.4,82,12.3,4,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,144,37,219.9,102,222.1,77,118.5,111,10.0,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,84,33,159.1,106,149.8,101,213.4,108,13.0,18,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,132,0,163.2,80,167.6,90,87.5,90,6.2,10,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,80,0,101.1,121,263.2,110,137.7,74,7.3,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0,1,0 +0,135,0,154.4,130,203.8,90,158.7,59,11.8,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,91,0,129.9,112,173.3,83,247.2,130,11.2,3,3,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,1,0 +1,133,44,174.0,80,209.4,113,224.1,87,14.1,7,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,94,0,170.1,113,271.8,94,110.7,78,8.7,4,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,0,0,1,1,0 +0,64,48,94.4,104,136.2,101,147.4,89,4.5,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,89,12,188.0,105,151.3,107,201.9,132,10.5,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +1,100,0,278.0,76,176.7,74,219.5,126,8.3,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,124,0,151.1,123,187.4,104,255.4,93,5.3,3,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 
+0,105,15,88.1,125,175.9,142,269.9,85,9.7,1,2,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,1,0,0,1 +0,53,0,164.1,106,206.0,56,194.7,124,11.4,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,95,0,198.4,113,235.9,144,325.6,99,10.1,3,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,33,0,184.4,111,203.8,110,237.4,100,9.3,5,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,101,0,232.7,114,186.4,123,153.3,122,11.5,6,5,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,85,0,236.9,93,197.7,113,309.1,78,11.4,7,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,82,0,130.0,110,185.3,88,178.7,105,8.3,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +1,152,41,146.8,128,285.6,96,213.6,80,4.3,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,1,0,1 +0,80,15,159.3,110,170.6,120,141.2,82,11.9,5,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,0,1 +0,167,0,219.1,100,242.9,90,168.9,101,10.1,4,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,1,1,0 +1,73,0,217.8,91,220.6,97,277.3,89,10.3,6,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,1,0,1,0,1,0 +0,53,32,131.2,63,227.4,125,178.9,105,12.8,2,2,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,91,0,189.3,100,239.3,107,89.7,89,9.9,3,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,127,0,182.3,124,169.9,110,184.0,116,9.3,3,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,100,32,125.2,123,230.9,101,192.0,106,12.6,9,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,65,0,111.0,51,219.8,84,202.0,89,4.4,14,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,21,0,146.0,78,109.7,79,247.4,108,6.8,7,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,88,17,219.5,78,222.1,94,188.3,92,16.1,5,1,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,0,1 +0,90,0,109.9,102,220.8,114,104.0,133,10.9,6,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,1,0 +0,138,0,127.1,102,247.7,106,207.7,75,5.0,3,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,1,1,0,1,0 +0,120,0,252.0,120,150.2,106,151.8,96,9.6,1,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 
+0,112,30,60.6,113,165.9,96,132.8,99,13.3,7,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 +0,70,0,197.3,91,305.8,81,171.0,105,6.7,6,1,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,1,0,1,0 +0,81,46,168.3,124,270.9,103,222.5,98,6.7,2,4,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,0,0,1 diff --git a/sagemaker_model_monitor/visualization/SageMaker-Model-Monitor-Visualize.ipynb b/sagemaker_model_monitor/visualization/SageMaker-Model-Monitor-Visualize.ipynb new file mode 100644 index 0000000000..c7dc056413 --- /dev/null +++ b/sagemaker_model_monitor/visualization/SageMaker-Model-Monitor-Visualize.ipynb @@ -0,0 +1,180 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# SageMaker Model Monitor - visualizing monitoring results\n", + "\n", + "\n", + "The prebuilt container from SageMaker computes a variety of statistics and evaluates constraints out of the box. This notebook demonstrates how you can visualize them. You can grab the ProcessingJob arn from the executions behind a MonitoringSchedule and use this notebook to visualize the results.\n", + "\n", + "Let's import some python libraries that will be helpful for visualization" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "from IPython.display import HTML, display\n", + "import json\n", + "import os\n", + "import boto3\n", + "\n", + "import sagemaker\n", + "from sagemaker import session\n", + "from sagemaker.model_monitor import MonitoringExecution\n", + "from sagemaker.s3 import S3Downloader" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Get Utilities for Rendering\n", + "\n", + "The functions for plotting and rendering distribution statistics or constraint violations are implemented in a `utils` file so let's grab that." 
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "!wget https://raw.githubusercontent.com/awslabs/amazon-sagemaker-examples/master/sagemaker_model_monitor/visualization/utils.py\n", + "\n", + "import utils as mu" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Get Execution and Baseline details from Processing Job Arn\n", + "\n", + "Enter the ProcessingJob arn for an execution of a MonitoringSchedule below to get the result files associated with that execution." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "processing_job_arn = \"FILL-IN-PROCESSING-JOB-ARN\" " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "execution = MonitoringExecution.from_processing_arn(sagemaker_session=session.Session(), processing_job_arn=processing_job_arn)\n", + "exec_inputs = {inp['InputName']: inp for inp in execution.describe()['ProcessingInputs']}\n", + "exec_results = execution.output.destination" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "baseline_statistics_filepath = exec_inputs['baseline']['S3Input']['S3Uri'] if 'baseline' in exec_inputs else None\n", + "execution_statistics_filepath = os.path.join(exec_results, 'statistics.json')\n", + "violations_filepath = os.path.join(exec_results, 'constraint_violations.json')\n", + "\n", + "baseline_statistics = json.loads(S3Downloader.read_file(baseline_statistics_filepath)) if baseline_statistics_filepath is not None else None\n", + "execution_statistics = json.loads(S3Downloader.read_file(execution_statistics_filepath))\n", + "violations = json.loads(S3Downloader.read_file(violations_filepath))['violations']" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Overview\n", + "\n", + "The code below shows the violations and constraint checks across all features in a simple table."
+ ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "mu.show_violation_df(baseline_statistics=baseline_statistics, latest_statistics=execution_statistics, violations=violations)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Distributions\n", + "\n", + "This section visualizes the distribution and renders the distribution statistics for all features" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "features = mu.get_features(execution_statistics)\n", + "feature_baselines = mu.get_features(baseline_statistics)" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "mu.show_distributions(features)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Execution Stats vs Baseline" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "mu.show_distributions(features, feature_baselines)" + ] + } + ], + "metadata": { + "kernelspec": { + "display_name": "Python 3", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.7.3" + } + }, + "nbformat": 4, + "nbformat_minor": 4 +} diff --git a/sagemaker_model_monitor/visualization/utils.py b/sagemaker_model_monitor/visualization/utils.py new file mode 100644 index 0000000000..ee5efb2fcb --- /dev/null +++ b/sagemaker_model_monitor/visualization/utils.py @@ -0,0 +1,293 @@ +from enum import Enum +import warnings +import re +import numpy as np +import pandas as pd +from IPython.display import HTML, display +from matplotlib import pyplot as plt +plt.style.use('seaborn-muted') + + +##### TABLE + +def group_by_feature(baseline_statistics, latest_statistics, violations): + features = {} + # add baseline statistics + if baseline_statistics: + for baseline_feature in baseline_statistics['features']: + feature_name = baseline_feature['name'] + if feature_name not in features: + features[feature_name] = {} + features[feature_name]['baseline'] = baseline_feature + # add latest statistics + if latest_statistics: + for latest_feature in latest_statistics['features']: + feature_name = latest_feature['name'] + if feature_name not in features: + features[feature_name] = {} + features[feature_name]['latest'] = latest_feature + # add violations + if violations: + for violation in violations: + feature_name = violation['feature_name'] + if feature_name not in features: + features[feature_name] = {} + if 'violations' in features[feature_name]: + features[feature_name]['violations'] += [violation] + else: + features[feature_name]['violations'] = [violation] + return features + + +def violation_exists(feature, check_type): + if 'violations' in feature: + if check_type in set([v['constraint_check_type'] for v in feature['violations']]): + return True + return False + + +def create_data_type_df(feature_names, features): + columns = ['data_type'] + rows = [] + rows_style = [] + for feature_name in feature_names: + feature = features[feature_name] + latest = feature['latest']['inferred_type'] + violation = violation_exists(feature, 'data_type_check') + rows.append([latest]) + rows_style.append([violation]) + df = pd.DataFrame(rows, 
index=feature_names, columns=columns) + df_style = pd.DataFrame(rows_style, index=feature_names, columns=columns) + return df, df_style + + +def get_completeness(feature): + if feature['inferred_type'] in set(['Fractional', 'Integral']): + common = feature['numerical_statistics']['common'] + elif feature['inferred_type'] == 'String': + common = feature['string_statistics']['common'] + else: + raise ValueError('Unknown `inferred_type` {}.'.format(feature['inferred_type'])) + num_present = common['num_present'] + num_missing = common['num_missing'] + completeness = num_present / (num_present + num_missing) + return completeness + + +def create_completeness_df(feature_names, features): + columns = ['completeness'] + rows = [] + rows_style = [] + for feature_name in feature_names: + feature = features[feature_name] + latest = get_completeness(feature['latest']) + violation = violation_exists(feature, 'completeness_check') + rows.append([latest]) + rows_style.append([violation]) + df = pd.DataFrame(rows, index=feature_names, columns=columns) + df_style = pd.DataFrame(rows_style, index=feature_names, columns=columns) + return df, df_style + + +def get_baseline_drift(feature): + if 'violations' in feature: + for violation in feature['violations']: + if violation['constraint_check_type'] == 'baseline_drift_check': + desc = violation['description'] + matches = re.search('distance: (.+) exceeds', desc) + if matches: + match = matches.group(1) + return float(match) + return np.nan + + +def create_baseline_drift_df(feature_names, features): + columns = ['baseline_drift'] + rows = [] + rows_style = [] + for feature_name in feature_names: + feature = features[feature_name] + latest = get_baseline_drift(feature) + violation = violation_exists(feature, 'baseline_drift_check') + rows.append([latest]) + rows_style.append([violation]) + df = pd.DataFrame(rows, index=feature_names, columns=columns) + df_style = pd.DataFrame(rows_style, index=feature_names, columns=columns) + return df, df_style + + +def get_categorical_values(feature): + if 'violations' in feature: + for violation in feature['violations']: + if violation['constraint_check_type'] == 'categorical_values_check': + desc = violation['description'] + matches = re.search('Value: (.+) does not meet the constraint requirement!', desc) + if matches: + match = matches.group(1) + return float(match) + return np.nan + + +def create_categorical_values_df(feature_names, features): + columns = ['categorical_values'] + rows = [] + rows_style = [] + for feature_name in feature_names: + feature = features[feature_name] + latest = get_categorical_values(feature) + violation = violation_exists(feature, 'categorical_values_check') + rows.append([latest]) + rows_style.append([violation]) + df = pd.DataFrame(rows, index=feature_names, columns=columns) + df_style = pd.DataFrame(rows_style, index=feature_names, columns=columns) + return df, df_style + + +def create_violation_df(baseline_statistics, latest_statistics, violations): + features = group_by_feature(baseline_statistics, latest_statistics, violations) + feature_names = list(features.keys()) + feature_names.sort() + data_type_df, data_type_df_style = create_data_type_df(feature_names, features) + completeness_df, completeness_df_style = create_completeness_df(feature_names, features) + baseline_drift_df, baseline_drift_df_style = create_baseline_drift_df(feature_names, features) + categorical_values_df, categorical_values_df_style = create_categorical_values_df(feature_names, features) + df = 
pd.concat([data_type_df, completeness_df, baseline_drift_df, categorical_values_df], axis=1) + df_style = pd.concat([data_type_df_style, completeness_df_style, baseline_drift_df_style, categorical_values_df_style], axis=1) + return df, df_style + + +def style_violation_df(df, df_style): + + def all_white(df): + attr = 'background-color: white' + return pd.DataFrame(attr, index=df.index, columns=df.columns) + + def highlight_failed_row(df): + nonlocal df_style + df_style_cp = df_style.copy() + values = df_style_cp.values.any(axis=1, keepdims=True) * np.ones_like(df_style) + df_style_cp = pd.DataFrame(values, index=df.index, columns=df.columns) + df_style_cp = df_style_cp.replace(to_replace=True, value='background-color: #fff7dc') + df_style_cp = df_style_cp.replace(to_replace=False, value='') + return df_style_cp + + def highlight_failed(df): + nonlocal df_style + df_style_cp = df_style.copy() + df_style_cp = df_style_cp.replace(to_replace=True, value='background-color: orange') + df_style_cp = df_style_cp.replace(to_replace=False, value='') + return df_style_cp + + def style_percentage(value): + if np.isnan(value): + return 'N/A' + else: + return '{:.2%}'.format(value) + + for column_name in ['completeness', 'baseline_drift', 'categorical_values']: + df[column_name] = df[column_name].apply(style_percentage) + + return df.style\ + .apply(all_white, axis=None)\ + .apply(highlight_failed_row, axis=None)\ + .apply(highlight_failed, axis=None) + + +def show_violation_df(baseline_statistics, latest_statistics, violations): + violation_df, violation_df_style = create_violation_df(baseline_statistics, latest_statistics, violations) + return style_violation_df(violation_df, violation_df_style) + + +##### VISUALIZATION + +def get_features(raw_data): + return {feature['name']: feature for feature in raw_data['features']} + +def show_distributions(features, baselines=None): + string_features = [name for name, feature in features.items() if FeatureType(feature['inferred_type']) == FeatureType.STRING] + numerical_features = [name for name, feature in features.items() if name not in string_features] + numerical_table = pd.concat([_summary_stats(features[feat]) for feat in numerical_features], axis=0) if numerical_features else None + string_table = pd.concat([_summary_stats(features[feat]) for feat in string_features], axis=0) if string_features else None + if numerical_features: + display(HTML("
<h3>{msg}</h3>
".format(msg="Numerical Features"))) + display(numerical_table) + _display_charts(_get_charts(features, numerical_features, baselines)) + if string_features: + display(HTML("
<h3>{msg}</h3>
".format(msg="String Features"))) + display(string_table) + _display_charts(_get_charts(features, string_features, baselines), numerical=False) + +def _display_charts(chart_tables, ncols=5, numerical=True): + nrows = int(np.ceil(len(chart_tables)/ncols)) + fig, ax = plt.subplots(nrows, ncols, figsize=(20, 4*nrows)) + for i, chart_table in enumerate(chart_tables): + row, col = i//5, i%5 + curr_ax = ax[row][col] if nrows > 1 else ax[col] + opacity = 0.7 + if numerical: + c = chart_table[0].sort_values(by=["lower_bound"]) + c_width = c.upper_bound.values[0] - c.lower_bound.values[0] + pos_c = 0.5 * (c.upper_bound.values + c.lower_bound.values) + + else: + c = chart_table[0].sort_values(by=["frequency"], ascending=False).iloc[:10] if len(chart_table[0]) > 10 else chart_table[0].sort_values(by=["frequency"], ascending=False) + c_width = 0.35 + pos_c = np.arange(len(c.value.values)) + + curr_ax.bar(pos_c, c.frequency, c_width, label='collected', alpha=opacity) + + if len(chart_table) > 1: #also includes baseline stats info + if numerical: + b = chart_table[1].sort_values(by=["lower_bound"]) + b_width = b.upper_bound.values[0] - b.lower_bound.values[0] + pos_b = 0.5 * (b.upper_bound.values + b.lower_bound.values) + curr_ax.bar(pos_b, b.frequency, b_width, label='baseline', alpha=opacity) + + else: + b = c.merge(chart_table[1], how='left', on=['value']) + b_width = 0.35 + pos_b = np.arange(len(b.value.values)) + b_width + curr_ax.bar(pos_b, b.frequency_y, b_width, label='baseline', alpha=opacity) + + curr_ax.legend() + + if not numerical: + curr_ax.set_xticks(pos_c + c_width/2) + curr_ax.set_xticklabels([label[:10] if len(label) > 10 else label for label in c.value.values], ) + [(tick.set_rotation(90), tick.set_fontsize(8)) for tick in curr_ax.get_xticklabels()] + curr_ax.set_xlabel(c.key.values[0]) + plt.ylabel('Frequency') + if ncols*nrows != len(chart_tables): + [a.set_visible(False) for a in ax.flat[-(ncols*nrows-len(chart_tables)):]] + plt.show(); + +def _get_charts(features, feature_types, baselines=None): + charts = [(_extract_dist(features[feat]), _extract_dist(baselines[feat])) for feat in feature_types] if baselines is not None else [(_extract_dist(features[feat]),) for feat in feature_types] + return [chart for chart in charts if not chart[0].empty] + +def _extract_dist(feature_dict): + try: + stats_key = 'string_statistics' if FeatureType(feature_dict['inferred_type']) == FeatureType.STRING else 'numerical_statistics' + distribution_type = 'categorical' if FeatureType(feature_dict['inferred_type']) == FeatureType.STRING else 'kll' + table = pd.DataFrame(feature_dict[stats_key]['distribution'][distribution_type]['buckets']) + table['frequency'] = table['count']/table['count'].sum() + table['key'] = [feature_dict['name']]*len(table) + except KeyError: + table = pd.DataFrame() + return table + +def _summary_stats(feature_dict): + stats_key = 'string_statistics' if FeatureType(feature_dict['inferred_type']) == FeatureType.STRING else 'numerical_statistics' + common = pd.DataFrame(feature_dict[stats_key]['common'], index=[feature_dict['name']]) + specific = pd.DataFrame({k:v for k,v in feature_dict[stats_key].items() if k != 'common' and k != 'distribution'}, + index=[feature_dict['name']]) + return pd.concat([common, specific], axis=1) + + +class FeatureType(Enum): + INTEGRAL = "Integral" + FRACTIONAL = "Fractional" + STRING = "String" + UNKNOWN = "Unknown" + + \ No newline at end of file diff --git 
a/sagemaker_processing/feature_transformation_with_sagemaker_processing/container/Dockerfile b/sagemaker_processing/feature_transformation_with_sagemaker_processing/container/Dockerfile new file mode 100644 index 0000000000..7ca67a3769 --- /dev/null +++ b/sagemaker_processing/feature_transformation_with_sagemaker_processing/container/Dockerfile @@ -0,0 +1,57 @@ +FROM openjdk:8-jre-slim + +RUN apt-get update +RUN apt-get install -y curl unzip python3 python3-setuptools python3-pip python-dev python3-dev python-psutil +RUN pip3 install py4j psutil==5.6.5 numpy==1.17.4 +RUN apt-get clean +RUN rm -rf /var/lib/apt/lists/* + +# http://blog.stuart.axelbrooke.com/python-3-on-spark-return-of-the-pythonhashseed +ENV PYTHONHASHSEED 0 +ENV PYTHONIOENCODING UTF-8 +ENV PIP_DISABLE_PIP_VERSION_CHECK 1 + +# Install Hadoop +ENV HADOOP_VERSION 3.0.0 +ENV HADOOP_HOME /usr/hadoop-$HADOOP_VERSION +ENV HADOOP_CONF_DIR=$HADOOP_HOME/etc/hadoop +ENV PATH $PATH:$HADOOP_HOME/bin +RUN curl -sL --retry 3 \ + "http://archive.apache.org/dist/hadoop/common/hadoop-$HADOOP_VERSION/hadoop-$HADOOP_VERSION.tar.gz" \ + | gunzip \ + | tar -x -C /usr/ \ + && rm -rf $HADOOP_HOME/share/doc \ + && chown -R root:root $HADOOP_HOME + +# Install Spark +ENV SPARK_VERSION 2.4.4 +ENV SPARK_PACKAGE spark-${SPARK_VERSION}-bin-without-hadoop +ENV SPARK_HOME /usr/spark-${SPARK_VERSION} +ENV SPARK_DIST_CLASSPATH="$HADOOP_HOME/etc/hadoop/*:$HADOOP_HOME/share/hadoop/common/lib/*:$HADOOP_HOME/share/hadoop/common/*:$HADOOP_HOME/share/hadoop/hdfs/*:$HADOOP_HOME/share/hadoop/hdfs/lib/*:$HADOOP_HOME/share/hadoop/hdfs/*:$HADOOP_HOME/share/hadoop/yarn/lib/*:$HADOOP_HOME/share/hadoop/yarn/*:$HADOOP_HOME/share/hadoop/mapreduce/lib/*:$HADOOP_HOME/share/hadoop/mapreduce/*:$HADOOP_HOME/share/hadoop/tools/lib/*" +ENV PATH $PATH:${SPARK_HOME}/bin +RUN curl -sL --retry 3 \ + "https://archive.apache.org/dist/spark/spark-${SPARK_VERSION}/${SPARK_PACKAGE}.tgz" \ + | gunzip \ + | tar x -C /usr/ \ + && mv /usr/$SPARK_PACKAGE $SPARK_HOME \ + && chown -R root:root $SPARK_HOME + +# Point Spark at proper python binary +ENV PYSPARK_PYTHON=/usr/bin/python3 + +# Setup Spark/Yarn/HDFS user as root +ENV PATH="/usr/bin:/opt/program:${PATH}" +ENV YARN_RESOURCEMANAGER_USER="root" +ENV YARN_NODEMANAGER_USER="root" +ENV HDFS_NAMENODE_USER="root" +ENV HDFS_DATANODE_USER="root" +ENV HDFS_SECONDARYNAMENODE_USER="root" + +# Set up bootstrapping program and Spark configuration +COPY program /opt/program +RUN chmod +x /opt/program/submit +COPY hadoop-config /opt/hadoop-config + +WORKDIR $SPARK_HOME + +ENTRYPOINT ["/opt/program/submit"] diff --git a/sagemaker_processing/feature_transformation_with_sagemaker_processing/container/hadoop-config/core-site.xml b/sagemaker_processing/feature_transformation_with_sagemaker_processing/container/hadoop-config/core-site.xml new file mode 100644 index 0000000000..4266972657 --- /dev/null +++ b/sagemaker_processing/feature_transformation_with_sagemaker_processing/container/hadoop-config/core-site.xml @@ -0,0 +1,21 @@ +<?xml version="1.0" encoding="UTF-8"?> +<configuration> + <property> + <name>fs.defaultFS</name> + <value>hdfs://nn_uri/</value> + <description>NameNode URI</description> + </property> + <property> + <name>fs.s3a.aws.credentials.provider</name> + <value>com.amazonaws.auth.ContainerCredentialsProvider</value> + <description>AWS S3 credential provider</description> + </property> + <property> + <name>fs.AbstractFileSystem.s3a.impl</name> + <value>org.apache.hadoop.fs.s3a.S3A</value> + <description>s3a filesystem implementation</description> + </property> +</configuration> diff --git a/sagemaker_processing/feature_transformation_with_sagemaker_processing/container/hadoop-config/hdfs-site.xml b/sagemaker_processing/feature_transformation_with_sagemaker_processing/container/hadoop-config/hdfs-site.xml new file mode 100644 index
0000000000..6ccfb8fd23 --- /dev/null +++ b/sagemaker_processing/feature_transformation_with_sagemaker_processing/container/hadoop-config/hdfs-site.xml @@ -0,0 +1,19 @@ +<?xml version="1.0" encoding="UTF-8"?> +<configuration> + <property> + <name>dfs.datanode.data.dir</name> + <value>file:///opt/amazon/hadoop/hdfs/datanode</value> + <description>Comma separated list of paths on the local filesystem of a DataNode where it should store its blocks.</description> + </property> + <property> + <name>dfs.namenode.name.dir</name> + <value>file:///opt/amazon/hadoop/hdfs/namenode</value> + <description>Path on the local filesystem where the NameNode stores the namespace and transaction logs persistently.</description> + </property> +</configuration> diff --git a/sagemaker_processing/feature_transformation_with_sagemaker_processing/container/hadoop-config/spark-defaults.conf b/sagemaker_processing/feature_transformation_with_sagemaker_processing/container/hadoop-config/spark-defaults.conf new file mode 100644 index 0000000000..f624b4d9eb --- /dev/null +++ b/sagemaker_processing/feature_transformation_with_sagemaker_processing/container/hadoop-config/spark-defaults.conf @@ -0,0 +1,3 @@ +spark.driver.host=sd_host +spark.executor.memory=exec_mem +spark.executor.cores=exec_cores diff --git a/sagemaker_processing/feature_transformation_with_sagemaker_processing/container/hadoop-config/yarn-site.xml b/sagemaker_processing/feature_transformation_with_sagemaker_processing/container/hadoop-config/yarn-site.xml new file mode 100644 index 0000000000..a2dca27ac1 --- /dev/null +++ b/sagemaker_processing/feature_transformation_with_sagemaker_processing/container/hadoop-config/yarn-site.xml @@ -0,0 +1,60 @@ +<?xml version="1.0" encoding="UTF-8"?> +<configuration> + <property> + <name>yarn.scheduler.minimum-allocation-mb</name> + <value>minimum_allocation_mb</value> + <description>Minimum limit of memory to allocate to each container request at the Resource Manager.</description> + </property> + <property> + <name>yarn.scheduler.maximum-allocation-mb</name> + <value>maximum_allocation_mb</value> + <description>Maximum limit of memory to allocate to each container request at the Resource Manager.</description> + </property> + <property> + <name>yarn.scheduler.minimum-allocation-vcores</name> + <value>minimum_allocation_vcores</value> + <description>The minimum allocation for every container request at the RM, in terms of virtual CPU cores. Requests lower than this won't take effect, and the specified value will get allocated the minimum.</description> + </property> + <property> + <name>yarn.scheduler.maximum-allocation-vcores</name> + <value>maximum_allocation_vcores</value> + <description>The maximum allocation for every container request at the RM, in terms of virtual CPU cores. Requests higher than this won't take effect, and will get capped to this value.</description> + </property> + <property> + <name>yarn.nodemanager.resource.memory-mb</name> + <value>memory_mb_total</value> + <description>Physical memory, in MB, to be made available to running containers</description> + </property> + <property> + <name>yarn.nodemanager.resource.cpu-vcores</name> + <value>cpu_vcores_total</value> + <description>Number of CPU cores that can be allocated for containers.</description> + </property> + <property> + <name>yarn.resourcemanager.hostname</name> + <value>rm_hostname</value> + <description>The hostname of the RM.</description> + </property> + <property> + <name>yarn.nodemanager.hostname</name> + <value>nm_hostname</value> + <description>The hostname of the NM.</description> + </property> + <property> + <name>yarn.nodemanager.vmem-pmem-ratio</name> + <value>5</value> + <description>Ratio between virtual memory to physical memory.</description> + </property> + <property> + <name>yarn.resourcemanager.am.max-attempts</name> + <value>1</value> + <description>The maximum number of application attempts.</description> + </property>
+ <property> + <name>yarn.nodemanager.env-whitelist</name> + <value>JAVA_HOME,HADOOP_COMMON_HOME,HADOOP_HDFS_HOME,HADOOP_CONF_DIR,YARN_HOME,AWS_CONTAINER_CREDENTIALS_RELATIVE_URI</value> + <description>Environment variable whitelist</description> + </property> +</configuration> diff --git a/sagemaker_processing/feature_transformation_with_sagemaker_processing/container/program/bootstrap.py b/sagemaker_processing/feature_transformation_with_sagemaker_processing/container/program/bootstrap.py new file mode 100644 index 0000000000..6129b2ac6c --- /dev/null +++ b/sagemaker_processing/feature_transformation_with_sagemaker_processing/container/program/bootstrap.py @@ -0,0 +1,161 @@ +import os +import socket +import json +import psutil +import subprocess +import sys +import time +from shutil import copyfile + +HADOOP_CONFIG_PATH = '/opt/hadoop-config/' +HADOOP_PATH = '/usr/hadoop-3.0.0' +SPARK_PATH = '/usr/spark-2.4.4' + + +def copy_cluster_config(): + src = os.path.join(HADOOP_CONFIG_PATH, "hdfs-site.xml") + dst = HADOOP_PATH + '/etc/hadoop/hdfs-site.xml' + copyfile(src, dst) + + src = os.path.join(HADOOP_CONFIG_PATH, "core-site.xml") + dst = HADOOP_PATH + '/etc/hadoop/core-site.xml' + copyfile(src, dst) + + src = os.path.join(HADOOP_CONFIG_PATH, "yarn-site.xml") + dst = HADOOP_PATH + '/etc/hadoop/yarn-site.xml' + copyfile(src, dst) + + src = os.path.join(HADOOP_CONFIG_PATH, "spark-defaults.conf") + dst = SPARK_PATH + '/conf/spark-defaults.conf' + copyfile(src, dst) + + +def copy_aws_jars(): + src = HADOOP_PATH + "/share/hadoop/tools/lib/aws-java-sdk-bundle-1.11.199.jar" + dst = HADOOP_PATH + "/share/hadoop/common/lib/aws-java-sdk-bundle-1.11.199.jar" + copyfile(src, dst) + + src = HADOOP_PATH + "/share/hadoop/tools/lib/hadoop-aws-3.0.0.jar" + dst = HADOOP_PATH + "/share/hadoop/common/lib/hadoop-aws-3.0.0.jar" + copyfile(src, dst) + + +def get_resource_config(): + resource_config_path = '/opt/ml/config/resourceconfig.json' + with open(resource_config_path, 'r') as f: + return json.load(f) + + +def write_runtime_cluster_config(): + resource_config = get_resource_config() + master_host = resource_config['hosts'][0] + master_ip = get_ip_from_host(master_host) + current_host = resource_config['current_host'] + + core_site_file_path = HADOOP_PATH + "/etc/hadoop/core-site.xml" + yarn_site_file_path = HADOOP_PATH + "/etc/hadoop/yarn-site.xml" + + hadoop_env_file_path = HADOOP_PATH + "/etc/hadoop/hadoop-env.sh" + yarn_env_file_path = HADOOP_PATH + "/etc/hadoop/yarn-env.sh" + spark_conf_file_path = SPARK_PATH + "/conf/spark-defaults.conf" + + # Pass through environment variables to hadoop env + with open(hadoop_env_file_path, 'a') as hadoop_env_file: + hadoop_env_file.write("export JAVA_HOME=" + os.environ['JAVA_HOME'] + "\n") + hadoop_env_file.write("export SPARK_MASTER_HOST=" + master_ip + "\n") + hadoop_env_file.write("export AWS_CONTAINER_CREDENTIALS_RELATIVE_URI=" + os.environ.get('AWS_CONTAINER_CREDENTIALS_RELATIVE_URI', '') + "\n") + + # Add YARN log directory + with open(yarn_env_file_path, 'a') as yarn_env_file: + yarn_env_file.write("export YARN_LOG_DIR=/var/log/yarn/") + + # Configure ip address for name node + with open(core_site_file_path, 'r') as core_file: + file_data = core_file.read() + file_data = file_data.replace('nn_uri', master_ip) + with open(core_site_file_path, 'w') as core_file: + core_file.write(file_data) + + # Configure hostname for resource manager and node manager + with open(yarn_site_file_path, 'r') as yarn_file: + file_data = yarn_file.read() + file_data = file_data.replace('rm_hostname', master_ip) + file_data = file_data.replace('nm_hostname', current_host) +
with open(yarn_site_file_path, 'w') as yarn_file: + yarn_file.write(file_data) + + # Configure yarn resource limitation + mem = int(psutil.virtual_memory().total/(1024*1024)) # total physical memory in mb + cores = psutil.cpu_count(logical=True) # vCPUs + + minimum_allocation_mb = '1' + maximum_allocation_mb = str(mem) + minimum_allocation_vcores = '1' + maximum_allocation_vcores = str(cores) + # Add some residual in memory due to rounding in memory allocation + memory_mb_total = str(mem+2048) + # Ensure core allocations + cpu_vcores_total = str(cores*16) + + with open(yarn_site_file_path, 'r') as yarn_file: + file_data = yarn_file.read() + file_data = file_data.replace('minimum_allocation_mb', minimum_allocation_mb) + file_data = file_data.replace('maximum_allocation_mb', maximum_allocation_mb) + file_data = file_data.replace('minimum_allocation_vcores', minimum_allocation_vcores) + file_data = file_data.replace('maximum_allocation_vcores', maximum_allocation_vcores) + file_data = file_data.replace('memory_mb_total', memory_mb_total) + file_data = file_data.replace('cpu_vcores_total', cpu_vcores_total) + with open(yarn_site_file_path, 'w') as yarn_file: + yarn_file.write(file_data) + + # Configure Spark defaults + with open(spark_conf_file_path, 'r') as spark_file: + file_data = spark_file.read() + file_data = file_data.replace('sd_host', master_ip) + file_data = file_data.replace('exec_mem', str(int((mem / 3)*2.2))+'m') + file_data = file_data.replace('exec_cores', str(min(5, cores-1))) + with open(spark_conf_file_path, 'w') as spark_file: + spark_file.write(file_data) + print("Finished Yarn configuration files setup.\n") + + +def start_daemons(): + resource_config = get_resource_config() + current_host = resource_config['current_host'] + master_host = resource_config['hosts'][0] + + cmd_namenode_format = HADOOP_PATH + '/bin/hdfs namenode -format -force' + cmd_start_dfs = HADOOP_PATH + '/sbin/start-dfs.sh' + cmd_start_namenode = HADOOP_PATH + '/sbin/hadoop-daemon.sh start namenode' + cmd_start_datanode = HADOOP_PATH + '/sbin/hadoop-daemon.sh start datanode' + cmd_start_nodemanager = HADOOP_PATH + '/sbin/yarn-daemon.sh start nodemanager' + cmd_start_yarn = HADOOP_PATH + '/sbin/start-yarn.sh' + + if current_host == master_host: + subprocess.call(cmd_namenode_format, shell=True) + subprocess.call(cmd_start_dfs, shell=True) + subprocess.call(cmd_start_namenode, shell=True) + subprocess.call(cmd_start_datanode, shell=True) + subprocess.call(cmd_start_yarn, shell=True) + else: + subprocess.call(cmd_start_datanode, shell=True) + subprocess.call(cmd_start_nodemanager, shell=True) + + +def get_ip_from_host(host_name): + IP_WAIT_TIME = 300 + counter = 0 + ip = '' + + while counter < IP_WAIT_TIME and ip == '': + try: + ip = socket.gethostbyname(host_name) + break + except: + counter += 1 + time.sleep(1) + + if counter == IP_WAIT_TIME and ip == '': + raise Exception("Exceeded max wait time of 300s for hostname resolution") + + return ip diff --git a/sagemaker_processing/feature_transformation_with_sagemaker_processing/container/program/submit b/sagemaker_processing/feature_transformation_with_sagemaker_processing/container/program/submit new file mode 100644 index 0000000000..dad70fe40e --- /dev/null +++ b/sagemaker_processing/feature_transformation_with_sagemaker_processing/container/program/submit @@ -0,0 +1,94 @@ +#!/usr/bin/env python3 + +from __future__ import print_function + +import os +import sys +import subprocess +import traceback +import bootstrap +import signal +import sys +import time + 
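+# NOTE (comment added for clarity): this entrypoint bootstraps HDFS/YARN on every node via bootstrap.py, runs spark-submit on the master node only, and signals the worker nodes to exit by writing an /_END_OF_JOB marker file to HDFS; see the __main__ block below.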
+from shlex import quote + +prefix = '/opt/ml/processing/' + +jars_path = os.path.join(prefix, 'input/jars/') +application_jar_path = os.path.join(prefix, 'input/application_jar/') +script_path = os.path.join(prefix, 'input/code/') +py_files_path = os.path.join(prefix, 'input/py_files/') + + +def bootstrap_yarn(): + bootstrap.copy_aws_jars() + bootstrap.copy_cluster_config() + bootstrap.write_runtime_cluster_config() + bootstrap.start_daemons() + + +def spark_submit(): + try: + params = os.environ + cmd = ['bin/spark-submit', + '--master', + 'yarn', + '--deploy-mode', + 'client' + ] + + mode = params['mode'] + if mode == 'python': + if os.path.isdir(py_files_path): + py_files_list = [py_files_path + s for s in os.listdir(py_files_path)] + cmd.extend(['--py-files', ",".join(py_files_list)]) + + cmd.extend(sys.argv[1:]) + elif mode == 'jar': + main_class = params['main_class'] + jars_list = [jars_path + s for s in os.listdir(jars_path)] + + cmd.extend(['--class', main_class]) + cmd.extend(['--jars', ",".join(jars_list)]) + cmd.extend(sys.argv[1:]) + else: + print("Unrecognized mode", mode) + sys.exit(255) + + cmd_string = " ".join(quote(c) for c in cmd) + subprocess.run(cmd_string, check=True, shell=True) + except Exception as e: + # Write out error details, this will be returned as the ExitMessage in the job details + trc = traceback.format_exc() + with open('/opt/ml/output/message', 'w') as s: + s.write('Exception during processing: ' + str(e) + '\n' + trc) + # Printing this causes the exception to be in the processing job logs, as well. + print('Exception during processing: ' + str(e) + '\n' + trc, file=sys.stderr) + # A non-zero exit code causes the processing job to be marked as Failed. + sys.exit(255) + + +if __name__ == "__main__": + bootstrap_yarn() + + resource_config = bootstrap.get_resource_config() + master_host = resource_config['hosts'][0] + master_ip = bootstrap.get_ip_from_host(master_host) + current_host = resource_config['current_host'] + if current_host == master_host: + spark_submit() + # Spark app is complete, terminate the workers by putting an end of job file in hdfs + hosts = resource_config['hosts'] + for host in hosts: + if host != master_host: + subprocess.Popen(['hdfs', 'dfs', '-touchz', '/_END_OF_JOB']).wait() + time.sleep(60) + else: + # Worker nodes will sleep and wait for the end of job file to be written by the master + while True: + return_code = subprocess.Popen(['hdfs', 'dfs', '-stat', '/_END_OF_JOB'], stderr=subprocess.DEVNULL).wait() + if return_code == 0: + print("Received end of job signal, exiting...") + sys.exit(0) + time.sleep(5) diff --git a/sagemaker_processing/feature_transformation_with_sagemaker_processing/feature_transformation_with_sagemaker_processing.ipynb b/sagemaker_processing/feature_transformation_with_sagemaker_processing/feature_transformation_with_sagemaker_processing.ipynb new file mode 100644 index 0000000000..0bc3b88582 --- /dev/null +++ b/sagemaker_processing/feature_transformation_with_sagemaker_processing/feature_transformation_with_sagemaker_processing.ipynb @@ -0,0 +1,453 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Feature transformation with Amazon SageMaker Processing and SparkML\n", + "\n", + "Typically a machine learning (ML) process consists of a few steps:
First, gathering data with various ETL jobs, then pre-processing the data, featurizing the dataset by incorporating standard techniques or prior knowledge, and finally training an ML model using an algorithm.\n", + "\n", + "Often, distributed data processing frameworks such as Spark are used to pre-process data sets in order to prepare them for training. In this notebook we'll use Amazon SageMaker Processing, and leverage the power of Spark in a managed SageMaker environment to run our preprocessing workload. Then, we'll take our preprocessed dataset and train a regression model using XGBoost." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Contents\n", + "\n", + "1. [Objective](#Objective:-predict-the-age-of-an-Abalone-from-its-physical-measurement)\n", + "1. [Setup](#Setup)\n", + "1. [Using Amazon SageMaker Processing to execute a SparkML Job](#Using-Amazon-SageMaker-Processing-to-execute-a-SparkML-Job)\n", + " 1. [Downloading dataset and uploading to S3](#Downloading-dataset-and-uploading-to-S3)\n", + " 1. [Build a Spark container for running the preprocessing job](#Build-a-Spark-container-for-running-the-preprocessing-job)\n", + " 1. [Run the preprocessing job using Amazon SageMaker Processing](#Run-the-preprocessing-job-using-Amazon-SageMaker-Processing)\n", + " 1. [Inspect the preprocessed dataset](#Inspect-the-preprocessed-dataset)\n", + "1. [Train a regression model using the Amazon SageMaker XGBoost algorithm](#Train-a-regression-model-using-the-SageMaker-XGBoost-algorithm)\n", + " 1. [Retrieve the XGBoost algorithm image](#Retrieve-the-XGBoost-algorithm-image)\n", + " 1. [Set XGBoost model parameters and dataset details](#Set-XGBoost-model-parameters-and-dataset-details)\n", + " 1. [Train the XGBoost model](#Train-the-XGBoost-model)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Objective: predict the age of an Abalone from its physical measurement" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The dataset is available from [UCI Machine Learning](https://archive.ics.uci.edu/ml/datasets/abalone). The aim for this task is to determine age of an Abalone (a kind of shellfish) from its physical measurements. At the core, it's a regression problem. The dataset contains several features - `sex` (categorical), `length` (continuous), `diameter` (continuous), `height` (continuous), `whole_weight` (continuous), `shucked_weight` (continuous), `viscera_weight` (continuous), `shell_weight` (continuous) and `rings` (integer).Our goal is to predict the variable `rings` which is a good approximation for age (age is `rings` + 1.5). \n", + "\n", + "Use SparkML to process the dataset (apply one or many feature transformers) and upload the transformed dataset to Amazon S3 so that it can be used for training with XGBoost." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Setup\n", + "\n", + "Let's start by specifying:\n", + "* The S3 bucket and prefixes that you use for training and model data. Use the default bucket specified by the Amazon SageMaker session.\n", + "* The IAM role ARN used to give processing and training access to the dataset." 
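> Editor's note: the `rings`-to-age relationship above is easy to check locally once the CSV is on disk. The snippet below is a hypothetical sanity check, not part of the notebook; it assumes `abalone.csv` has already been downloaded (as done via `wget` later in the notebook) and supplies the schema's column names because the file has no header row.

```python
import pandas as pd

# Hypothetical local sanity check; assumes abalone.csv is in the working directory.
# The file ships without a header row, so the column names come from the schema above.
columns = ['sex', 'length', 'diameter', 'height', 'whole_weight',
           'shucked_weight', 'viscera_weight', 'shell_weight', 'rings']
df = pd.read_csv('abalone.csv', header=None, names=columns)

# The age approximation described above: age = rings + 1.5
df['age'] = df['rings'] + 1.5
print(df[['rings', 'age']].head())
```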
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "import sagemaker\n",
+    "from time import gmtime, strftime\n",
+    "\n",
+    "sagemaker_session = sagemaker.Session()\n",
+    "role = sagemaker.get_execution_role()\n",
+    "bucket = sagemaker_session.default_bucket()\n",
+    "\n",
+    "timestamp_prefix = strftime(\"%Y-%m-%d-%H-%M-%S\", gmtime())\n",
+    "\n",
+    "prefix = 'sagemaker/spark-preprocess-demo/' + timestamp_prefix\n",
+    "input_prefix = prefix + '/input/raw/abalone'\n",
+    "input_preprocessed_prefix = prefix + '/input/preprocessed/abalone'\n",
+    "model_prefix = prefix + '/model'"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "## Using Amazon SageMaker Processing to execute a SparkML job"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "### Downloading dataset and uploading to Amazon Simple Storage Service (Amazon S3)\n",
+    "\n",
+    "The Amazon SageMaker team downloaded the abalone dataset from the University of California, Irvine repository and uploaded it to an S3 bucket. In this notebook, you download from that bucket and upload to your own bucket so that Amazon SageMaker can access the dataset."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "# Fetch the dataset from the SageMaker bucket\n",
+    "!wget https://s3-us-west-2.amazonaws.com/sparkml-mleap/data/abalone/abalone.csv\n",
+    "\n",
+    "# Upload the training data to S3\n",
+    "sagemaker_session.upload_data(path='abalone.csv', bucket=bucket, key_prefix=input_prefix)"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "### Build a Spark container for running the preprocessing job\n",
+    "\n",
+    "An example Spark container is included in the `./container` directory of this example. The container handles the bootstrapping of all Spark configuration, and serves as a wrapper around the `spark-submit` CLI. At a high level, the container provides:\n",
+    "* A set of default Spark/YARN/Hadoop configurations\n",
+    "* A bootstrapping script for configuring and starting up Spark master/worker nodes\n",
+    "* A wrapper around the `spark-submit` CLI to submit a Spark application\n",
+    "\n",
+    "\n",
+    "After the container build and push process is complete, use the Amazon SageMaker Python SDK to submit a managed, distributed Spark application that performs our dataset preprocessing.\n",
+    "\n",
+    "Build the example Spark container."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "%cd container\n",
+    "!docker build -t sagemaker-spark-example .\n",
+    "%cd ../"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "Create an Amazon Elastic Container Registry (Amazon ECR) repository for the Spark container and push the image."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "import boto3\n",
+    "account_id = boto3.client('sts').get_caller_identity().get('Account')\n",
+    "region = boto3.session.Session().region_name\n",
+    "\n",
+    "ecr_repository = 'sagemaker-spark-example'\n",
+    "tag = ':latest'\n",
+    "spark_repository_uri = '{}.dkr.ecr.{}.amazonaws.com/{}'.format(account_id, region, ecr_repository + tag)\n",
+    "\n",
+    "# Create ECR repository and push docker image\n",
+    "!$(aws ecr get-login --region $region --registry-ids $account_id --no-include-email)\n",
+    "!aws ecr create-repository --repository-name $ecr_repository\n",
+    "!docker tag {ecr_repository + tag} $spark_repository_uri\n",
+    "!docker push $spark_repository_uri"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "### Run the preprocessing job using Amazon SageMaker Processing\n",
+    "\n",
+    "Next, use the Amazon SageMaker Python SDK to submit a processing job. Use the Spark container that was just built, and a SparkML script for preprocessing in the job configuration."
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "Create the SparkML preprocessing script."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "%%writefile preprocess.py\n",
+    "from __future__ import print_function\n",
+    "from __future__ import unicode_literals\n",
+    "\n",
+    "import time\n",
+    "import sys\n",
+    "import os\n",
+    "import shutil\n",
+    "import csv\n",
+    "\n",
+    "import pyspark\n",
+    "from pyspark.sql import SparkSession\n",
+    "from pyspark.ml import Pipeline\n",
+    "from pyspark.sql.types import StructField, StructType, StringType, DoubleType\n",
+    "from pyspark.ml.feature import StringIndexer, VectorIndexer, OneHotEncoder, VectorAssembler\n",
+    "from pyspark.sql.functions import *\n",
+    "\n",
+    "\n",
+    "def csv_line(data):\n",
+    "    r = ','.join(str(d) for d in data[1])\n",
+    "    return str(data[0]) + \",\" + r\n",
+    "\n",
+    "\n",
+    "def main():\n",
+    "    spark = SparkSession.builder.appName(\"PySparkAbalone\").getOrCreate()\n",
+    "\n",
+    "    # Convert command line args into a map of args\n",
+    "    args_iter = iter(sys.argv[1:])\n",
+    "    args = dict(zip(args_iter, args_iter))\n",
+    "\n",
+    "    # This is needed to save RDDs, which is the only way to write nested DataFrames in CSV format\n",
+    "    spark.sparkContext._jsc.hadoopConfiguration().set(\"mapred.output.committer.class\",\n",
+    "                                                      \"org.apache.hadoop.mapred.FileOutputCommitter\")\n",
+    "\n",
+    "    # Define the schema corresponding to the input data. The input data does not contain a header row\n",
+    "    schema = StructType([StructField(\"sex\", StringType(), True),\n",
+    "                         StructField(\"length\", DoubleType(), True),\n",
+    "                         StructField(\"diameter\", DoubleType(), True),\n",
+    "                         StructField(\"height\", DoubleType(), True),\n",
+    "                         StructField(\"whole_weight\", DoubleType(), True),\n",
+    "                         StructField(\"shucked_weight\", DoubleType(), True),\n",
+    "                         StructField(\"viscera_weight\", DoubleType(), True),\n",
+    "                         StructField(\"shell_weight\", DoubleType(), True),\n",
+    "                         StructField(\"rings\", DoubleType(), True)])\n",
+    "\n",
+    "    # Download the data from S3 into a DataFrame\n",
+    "    total_df = spark.read.csv(('s3a://' + os.path.join(args['s3_input_bucket'], args['s3_input_key_prefix'],\n",
+    "                                                       'abalone.csv')), header=False, schema=schema)\n",
+    "\n",
+    "    # StringIndexer on the sex column, which holds categorical values\n",
+    "    sex_indexer = StringIndexer(inputCol=\"sex\", outputCol=\"indexed_sex\")\n",
+    "\n",
+    "    # One-hot encoding is performed on the string-indexed sex column (indexed_sex)\n",
+    "    sex_encoder = OneHotEncoder(inputCol=\"indexed_sex\", outputCol=\"sex_vec\")\n",
+    "\n",
+    "    # VectorAssembler brings all the features together into a 1D vector that saves easily into CSV format\n",
+    "    assembler = VectorAssembler(inputCols=[\"sex_vec\",\n",
+    "                                           \"length\",\n",
+    "                                           \"diameter\",\n",
+    "                                           \"height\",\n",
+    "                                           \"whole_weight\",\n",
+    "                                           \"shucked_weight\",\n",
+    "                                           \"viscera_weight\",\n",
+    "                                           \"shell_weight\"],\n",
+    "                                outputCol=\"features\")\n",
+    "\n",
+    "    # The pipeline consists of the steps added above\n",
+    "    pipeline = Pipeline(stages=[sex_indexer, sex_encoder, assembler])\n",
+    "\n",
+    "    # This step trains the feature transformers\n",
+    "    model = pipeline.fit(total_df)\n",
+    "\n",
+    "    # This step transforms the dataset with information obtained from the previous fit\n",
+    "    transformed_total_df = model.transform(total_df)\n",
+    "\n",
+    "    # Split the overall dataset 80-20 into training and validation\n",
+    "    (train_df, validation_df) = transformed_total_df.randomSplit([0.8, 0.2])\n",
+    "\n",
+    "    # Convert the train DataFrame to an RDD to save in CSV format and upload to S3\n",
+    "    train_rdd = train_df.rdd.map(lambda x: (x.rings, x.features))\n",
+    "    train_lines = train_rdd.map(csv_line)\n",
+    "    train_lines.saveAsTextFile('s3a://' + os.path.join(args['s3_output_bucket'], args['s3_output_key_prefix'], 'train'))\n",
+    "\n",
+    "    # Convert the validation DataFrame to an RDD to save in CSV format and upload to S3\n",
+    "    validation_rdd = validation_df.rdd.map(lambda x: (x.rings, x.features))\n",
+    "    validation_lines = validation_rdd.map(csv_line)\n",
+    "    validation_lines.saveAsTextFile('s3a://' + os.path.join(args['s3_output_bucket'], args['s3_output_key_prefix'], 'validation'))\n",
+    "\n",
+    "\n",
+    "if __name__ == \"__main__\":\n",
+    "    main()"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "Run a processing job using the Docker image and preprocessing script you just created. When invoking the `spark_processor.run()` function, pass the Amazon S3 input and output paths as arguments; the preprocessing script uses these to determine the input and output locations in Amazon S3. Here, you also specify the number of instances and instance type that will be used for the distributed Spark job."
+   ]
+  },
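> Editor's note: the flat `arguments` list passed to `run()` reaches the script as alternating key/value tokens in `sys.argv`, which `preprocess.py` pairs up with the `iter`/`zip` idiom shown above. A standalone illustration of that idiom, using made-up values:

```python
# Stand-alone illustration of the argument-pairing idiom in preprocess.py.
# The bucket and prefix values below are made up for demonstration only.
argv = ['s3_input_bucket', 'my-bucket',
        's3_input_key_prefix', 'raw/abalone']

args_iter = iter(argv)
args = dict(zip(args_iter, args_iter))  # consecutive tokens become key/value pairs

print(args)  # {'s3_input_bucket': 'my-bucket', 's3_input_key_prefix': 'raw/abalone'}
```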
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "from sagemaker.processing import ScriptProcessor, ProcessingInput\n",
+    "spark_processor = ScriptProcessor(base_job_name='spark-preprocessor',\n",
+    "                                  image_uri=spark_repository_uri,\n",
+    "                                  command=['/opt/program/submit'],\n",
+    "                                  role=role,\n",
+    "                                  instance_count=2,\n",
+    "                                  instance_type='ml.r5.xlarge',\n",
+    "                                  max_runtime_in_seconds=1200,\n",
+    "                                  env={'mode': 'python'})\n",
+    "\n",
+    "spark_processor.run(code='preprocess.py',\n",
+    "                    arguments=['s3_input_bucket', bucket,\n",
+    "                               's3_input_key_prefix', input_prefix,\n",
+    "                               's3_output_bucket', bucket,\n",
+    "                               's3_output_key_prefix', input_preprocessed_prefix],\n",
+    "                    logs=False)"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "### Inspect the preprocessed dataset\n",
+    "Take a look at a few rows of the transformed dataset to make sure the preprocessing was successful."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "print('Top 5 rows from s3://{}/{}/train/'.format(bucket, input_preprocessed_prefix))\n",
+    "!aws s3 cp --quiet s3://$bucket/$input_preprocessed_prefix/train/part-00000 - | head -n5"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "## Train a regression model using the SageMaker XGBoost algorithm\n",
+    "\n",
+    "Use the Amazon SageMaker XGBoost algorithm to train on this dataset. You already know the Amazon S3 location where the preprocessed training data was uploaded as part of the processing job output."
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "### Retrieve the XGBoost algorithm image\n",
+    "\n",
+    "Retrieve the XGBoost built-in algorithm image so that you can use it in the training job."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "from sagemaker.amazon.amazon_estimator import get_image_uri\n",
+    "\n",
+    "training_image = get_image_uri(sagemaker_session.boto_region_name, 'xgboost', repo_version=\"0.90-1\")\n",
+    "print(training_image)"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "### Set XGBoost model parameters and dataset details\n",
+    "\n",
+    "Next, configure an Estimator for the XGBoost algorithm and the input dataset. The notebook is parameterized so that the same data location used in the SparkML script can now be passed to the XGBoost Estimator as well."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "s3_train_data = 's3://{}/{}/{}'.format(bucket, input_preprocessed_prefix, 'train/part')\n",
+    "s3_validation_data = 's3://{}/{}/{}'.format(bucket, input_preprocessed_prefix, 'validation/part')\n",
+    "s3_output_location = 's3://{}/{}/{}'.format(bucket, prefix, 'xgboost_model')\n",
+    "\n",
+    "xgb_model = sagemaker.estimator.Estimator(training_image,\n",
+    "                                          role,\n",
+    "                                          train_instance_count=1,\n",
+    "                                          train_instance_type='ml.m4.xlarge',\n",
+    "                                          train_volume_size=20,\n",
+    "                                          train_max_run=3600,\n",
+    "                                          input_mode='File',\n",
+    "                                          output_path=s3_output_location,\n",
+    "                                          sagemaker_session=sagemaker_session)\n",
+    "\n",
+    "xgb_model.set_hyperparameters(objective=\"reg:linear\",\n",
+    "                              eta=.2,\n",
+    "                              gamma=4,\n",
+    "                              max_depth=5,\n",
+    "                              num_round=10,\n",
+    "                              subsample=0.7,\n",
+    "                              silent=0,\n",
+    "                              min_child_weight=6)\n",
+    "\n",
+    "train_data = sagemaker.session.s3_input(s3_train_data, distribution='FullyReplicated',\n",
+    "                                        content_type='text/csv', s3_data_type='S3Prefix')\n",
+    "validation_data = sagemaker.session.s3_input(s3_validation_data, distribution='FullyReplicated',\n",
+    "                                             content_type='text/csv', s3_data_type='S3Prefix')\n",
+    "\n",
+    "data_channels = {'train': train_data, 'validation': validation_data}"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "### Train the XGBoost model"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "xgb_model.fit(inputs=data_channels, logs=True)"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "### Summary\n",
+    "\n",
+    "Voila! You completed the first portion of the machine learning pipeline using Amazon SageMaker Processing for feature transformation and Amazon SageMaker XGBoost for training a regression model."
+   ]
+  }
+ ],
+ "metadata": {
+  "kernelspec": {
+   "display_name": "conda_python3",
+   "language": "python",
+   "name": "conda_python3"
+  },
+  "language_info": {
+   "codemirror_mode": {
+    "name": "ipython",
+    "version": 3
+   },
+   "file_extension": ".py",
+   "mimetype": "text/x-python",
+   "name": "python",
+   "nbconvert_exporter": "python",
+   "pygments_lexer": "ipython3",
+   "version": "3.6.5"
+  }
+ },
+ "nbformat": 4,
+ "nbformat_minor": 4
+}
diff --git a/sagemaker_processing/scikit_learn_data_processing_and_model_evaluation/Processing-1.jpg b/sagemaker_processing/scikit_learn_data_processing_and_model_evaluation/Processing-1.jpg
new file mode 100644
index 0000000000..1cd4ab5e28
Binary files /dev/null and b/sagemaker_processing/scikit_learn_data_processing_and_model_evaluation/Processing-1.jpg differ
diff --git a/sagemaker_processing/scikit_learn_data_processing_and_model_evaluation/README.md b/sagemaker_processing/scikit_learn_data_processing_and_model_evaluation/README.md
new file mode 100644
index 0000000000..3d2af909b2
--- /dev/null
+++ b/sagemaker_processing/scikit_learn_data_processing_and_model_evaluation/README.md
@@ -0,0 +1,14 @@
+### Scikit-Learn Data Processing and Model Evaluation
+
+
+This notebook shows how you can:
+
+- run a processing job to run a Scikit-Learn script to clean, pre-process, perform feature engineering, and split the input data into train and test sets.
+- run a training job on the pre-processed training data to train a model.
+- run a processing job on the pre-processed test data to evaluate the trained model's performance.
+- use your own custom container to run processing jobs with your own Python libraries and dependencies.
+
+The dataset used is the [Census-Income KDD Dataset](https://archive.ics.uci.edu/ml/datasets/Census-Income+%28KDD%29). We will select features from this dataset, clean the data, turn the data into features that our training algorithm can use to train a binary classification model, and split the data into train and test sets.
+
+The task is to predict whether rows representing census responders have an income greater than `$50K` or less than `$50K`. The dataset is heavily class imbalanced, with most records being labeled as earning less than `$50K`. After training a logistic regression model, we will evaluate the model against a hold-out test dataset, and save the classification evaluation metrics, including precision, recall, and F1 score for each label, as well as accuracy and ROC AUC for the model.
+
diff --git a/sagemaker_processing/scikit_learn_data_processing_and_model_evaluation/scikit_learn_data_processing_and_model_evaluation.ipynb b/sagemaker_processing/scikit_learn_data_processing_and_model_evaluation/scikit_learn_data_processing_and_model_evaluation.ipynb
new file mode 100644
index 0000000000..1c12fa5419
--- /dev/null
+++ b/sagemaker_processing/scikit_learn_data_processing_and_model_evaluation/scikit_learn_data_processing_and_model_evaluation.ipynb
@@ -0,0 +1,563 @@
+{
+ "cells": [
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "## Amazon SageMaker Processing jobs\n",
+    "\n",
+    "With Amazon SageMaker Processing jobs, you can leverage a simplified, managed experience to run data pre- or post-processing and model evaluation workloads on the Amazon SageMaker platform.\n",
+    "\n",
+    "A processing job downloads input from Amazon Simple Storage Service (Amazon S3), then uploads outputs to Amazon S3 during or after the processing job.\n",
+    "\n",
+    "\n",
+    "\n",
+    "This notebook shows how you can:\n",
+    "\n",
+    "1. Run a processing job to run a scikit-learn script that cleans, pre-processes, performs feature engineering, and splits the input data into train and test sets.\n",
+    "2. Run a training job on the pre-processed training data to train a model.\n",
+    "3. Run a processing job on the pre-processed test data to evaluate the trained model's performance.\n",
+    "4. Use your own custom container to run processing jobs with your own Python libraries and dependencies.\n",
+    "\n",
+    "The dataset used here is the [Census-Income KDD Dataset](https://archive.ics.uci.edu/ml/datasets/Census-Income+%28KDD%29). You select features from this dataset, clean the data, turn the data into features that the training algorithm can use to train a binary classification model, and split the data into train and test sets. The task is to predict whether rows representing census responders have an income greater than `$50,000` or less than `$50,000`. The dataset is heavily class imbalanced, with most records being labeled as earning less than `$50,000`. After training a logistic regression model, you evaluate the model against a hold-out test dataset, and save the classification evaluation metrics, including precision, recall, and F1 score for each label, as well as accuracy and ROC AUC for the model."
+   ]
+  },
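> Editor's note: all of the metrics listed above are available in scikit-learn. A minimal sketch (not the notebook's actual evaluation script, and using made-up labels and scores) of how they can be computed:

```python
from sklearn.metrics import accuracy_score, classification_report, roc_auc_score

# Made-up hold-out labels, hard predictions, and predicted probabilities
y_true = [0, 0, 1, 0, 1, 1, 0, 0]
y_pred = [0, 1, 1, 0, 1, 0, 0, 0]
y_score = [0.1, 0.6, 0.9, 0.2, 0.8, 0.4, 0.3, 0.1]

print(classification_report(y_true, y_pred))        # per-label precision/recall/F1
print('accuracy:', accuracy_score(y_true, y_pred))  # overall accuracy
print('roc auc :', roc_auc_score(y_true, y_score))  # ranking quality of the scores
```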
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "## Data pre-processing and feature engineering"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "To run the scikit-learn preprocessing script as a processing job, create a `SKLearnProcessor`, which lets you run scripts inside processing jobs using the scikit-learn image provided."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "import boto3\n",
+    "import sagemaker\n",
+    "from sagemaker import get_execution_role\n",
+    "from sagemaker.sklearn.processing import SKLearnProcessor\n",
+    "\n",
+    "region = boto3.session.Session().region_name\n",
+    "\n",
+    "role = get_execution_role()\n",
+    "sklearn_processor = SKLearnProcessor(framework_version='0.20.0',\n",
+    "                                     role=role,\n",
+    "                                     instance_type='ml.m5.xlarge',\n",
+    "                                     instance_count=1)"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "Before introducing the script you use for data cleaning, pre-processing, and feature engineering, inspect the first 10 rows of the dataset. The target is predicting the `income` category. The features selected from the dataset are `age`, `education`, `major industry code`, `class of worker`, `num persons worked for employer`, `capital gains`, `capital losses`, and `dividends from stocks`."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "import pandas as pd\n",
+    "\n",
+    "input_data = 's3://sagemaker-sample-data-{}/processing/census/census-income.csv'.format(region)\n",
+    "df = pd.read_csv(input_data, nrows=10)\n",
+    "df.head(n=10)"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "This notebook cell writes a file `preprocessing.py`, which contains the pre-processing script. You can update the script and rerun this cell to overwrite `preprocessing.py`. You run this as a processing job in the next cell. In this script, you:\n",
+    "\n",
+    "* remove duplicates and rows with conflicting data\n",
+    "* transform the target `income` column into a column containing two labels\n",
+    "* transform the `age` and `num persons worked for employer` numerical columns into categorical features by binning them\n",
+    "* scale the continuous `capital gains`, `capital losses`, and `dividends from stocks` columns so they're suitable for training\n",
+    "* encode the `education`, `major industry code`, and `class of worker` columns so they're suitable for training\n",
+    "* split the data into training and test datasets, and save the training features and labels and test features and labels\n",
+    "\n",
+    "The training script will use the pre-processed training features and labels to train a model, and the model evaluation script will use the trained model and pre-processed test features and labels to evaluate the model."
+   ]
+  },
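> Editor's note: to make the binning step above concrete, here is a toy illustration of `KBinsDiscretizer` on a single `age` column. This is a sketch assuming scikit-learn 0.20, matching the `framework_version` used above; note that in the script below, `make_column_transformer` uses the `(columns, transformer)` tuple order of that release, which later scikit-learn versions flip to `(transformer, columns)`.

```python
import pandas as pd
from sklearn.preprocessing import KBinsDiscretizer

# Toy data: five ages, discretized into three one-hot encoded bins
ages = pd.DataFrame({'age': [18, 25, 40, 67, 90]})
binner = KBinsDiscretizer(n_bins=3, encode='onehot-dense')

# Each age becomes a row with a single 1 marking its bin
print(binner.fit_transform(ages))
```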
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "%%writefile preprocessing.py\n",
+    "\n",
+    "import argparse\n",
+    "import os\n",
+    "import warnings\n",
+    "\n",
+    "import pandas as pd\n",
+    "import numpy as np\n",
+    "from sklearn.model_selection import train_test_split\n",
+    "from sklearn.preprocessing import StandardScaler, OneHotEncoder, LabelBinarizer, KBinsDiscretizer\n",
+    "from sklearn.preprocessing import PolynomialFeatures\n",
+    "from sklearn.compose import make_column_transformer\n",
+    "\n",
+    "from sklearn.exceptions import DataConversionWarning\n",
+    "warnings.filterwarnings(action='ignore', category=DataConversionWarning)\n",
+    "\n",
+    "\n",
+    "columns = ['age', 'education', 'major industry code', 'class of worker', 'num persons worked for employer',\n",
+    "           'capital gains', 'capital losses', 'dividends from stocks', 'income']\n",
+    "class_labels = [' - 50000.', ' 50000+.']\n",
+    "\n",
+    "\n",
+    "def print_shape(df):\n",
+    "    negative_examples, positive_examples = np.bincount(df['income'])\n",
+    "    print('Data shape: {}, {} positive examples, {} negative examples'.format(df.shape, positive_examples, negative_examples))\n",
+    "\n",
+    "\n",
+    "if __name__ == '__main__':\n",
+    "    parser = argparse.ArgumentParser()\n",
+    "    parser.add_argument('--train-test-split-ratio', type=float, default=0.3)\n",
+    "    args, _ = parser.parse_known_args()\n",
+    "\n",
+    "    print('Received arguments {}'.format(args))\n",
+    "\n",
+    "    input_data_path = os.path.join('/opt/ml/processing/input', 'census-income.csv')\n",
+    "\n",
+    "    print('Reading input data from {}'.format(input_data_path))\n",
+    "    df = pd.read_csv(input_data_path)\n",
+    "    df = pd.DataFrame(data=df, columns=columns)\n",
+    "    df.dropna(inplace=True)\n",
+    "    df.drop_duplicates(inplace=True)\n",
+    "    df.replace(class_labels, [0, 1], inplace=True)\n",
+    "\n",
+    "    negative_examples, positive_examples = np.bincount(df['income'])\n",
+    "    print('Data after cleaning: {}, {} positive examples, {} negative examples'.format(df.shape, positive_examples, negative_examples))\n",
+    "\n",
+    "    split_ratio = args.train_test_split_ratio\n",
+    "    print('Splitting data into train and test sets with ratio {}'.format(split_ratio))\n",
+    "    X_train, X_test, y_train, y_test = train_test_split(df.drop('income', axis=1), df['income'], test_size=split_ratio, random_state=0)\n",
+    "\n",
+    "    preprocess = make_column_transformer(\n",
+    "        (['age', 'num persons worked for employer'], KBinsDiscretizer(encode='onehot-dense', n_bins=10)),\n",
+    "        (['capital gains', 'capital losses', 'dividends from stocks'], StandardScaler()),\n",
+    "        (['education', 'major industry code', 'class of worker'], OneHotEncoder(sparse=False))\n",
+    "    )\n",
+    "    print('Running preprocessing and feature engineering transformations')\n",
+    "    train_features = preprocess.fit_transform(X_train)\n",
+    "    test_features = preprocess.transform(X_test)\n",
+    "\n",
+    "    print('Train data shape after preprocessing: {}'.format(train_features.shape))\n",
+    "    print('Test data shape after preprocessing: {}'.format(test_features.shape))\n",
+    "\n",
+    "    train_features_output_path = os.path.join('/opt/ml/processing/train', 'train_features.csv')\n",
+    "    train_labels_output_path = os.path.join('/opt/ml/processing/train', 'train_labels.csv')\n",
+    "\n",
+    "    test_features_output_path = os.path.join('/opt/ml/processing/test', 'test_features.csv')\n",
+    "    test_labels_output_path = os.path.join('/opt/ml/processing/test', 'test_labels.csv')\n",
+    "\n",
+    "    print('Saving training features to {}'.format(train_features_output_path))\n",
+    "    pd.DataFrame(train_features).to_csv(train_features_output_path, header=False, index=False)\n",
+    "\n",
+    "    print('Saving test features to {}'.format(test_features_output_path))\n",
+    "    pd.DataFrame(test_features).to_csv(test_features_output_path, header=False, index=False)\n",
+    "\n",
+    "    print('Saving training labels to {}'.format(train_labels_output_path))\n",
+    "    y_train.to_csv(train_labels_output_path, header=False, index=False)\n",
+    "\n",
+    "    print('Saving test labels to {}'.format(test_labels_output_path))\n",
+    "    y_test.to_csv(test_labels_output_path, header=False, index=False)\n"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "Run this script as a processing job, using the `SKLearnProcessor.run()` method. Give the `run()` method one `ProcessingInput`, where the `source` is the census dataset in Amazon S3, and the `destination` is the path the script reads this data from, in this case `/opt/ml/processing/input`. These local paths inside the processing container must begin with `/opt/ml/processing/`.\n",
+    "\n",
+    "Also give the `run()` method a `ProcessingOutput`, where the `source` is the path the script writes output data to. For outputs, the `destination` defaults to an S3 bucket that the Amazon SageMaker Python SDK creates for you, following the format `s3://sagemaker--//output/