Updated: Markdown in several examples #55

Merged: 1 commit, Nov 26, 2017
@@ -30,17 +30,20 @@
"\n",
"## Setup\n",
"\n",
"### Prerequisites\n",
"\n",
"In order to successfully run this notebook, you must first:\n",
"\n",
"1. Have an existing KMS key from AWS IAM console or create one ([learn more](http://docs.aws.amazon.com/kms/latest/developerguide/create-keys.html)).\n",
"2. Allow the IAM role used for SageMaker to encrypt and decrypt data with this key from within applications and when using AWS services integrated with KMS ([learn more](http://docs.aws.amazon.com/console/kms/key-users)).\n",
"\n",
"We use the `key-id` from the KMS key ARN `arn:aws:kms:region:acct-id:key/key-id`.\n",
"\n",
"### General Setup\n",
"Let's start by specifying:\n",
"* AWS region.\n",
"* The IAM role arn used to give learning and hosting access to your data. See the documentation for how to specify these.\n",
"* The S3 bucket that you want to use for training and model data.\n",
"\n",
"### KMS key setup\n",
"1. Use an existing KMS key from AWS IAM console or create one ([learn more](http://docs.aws.amazon.com/kms/latest/developerguide/create-keys.html)).\n",
"2. Allow the IAM role used for SageMaker to encrypt and decrypt data with this key from within applications and when using AWS services integrated with KMS ([learn more](http://docs.aws.amazon.com/console/kms/key-users)).\n",
"\n",
"We use the `key-id` from the KMS key ARN `arn:aws:kms:region:acct-id:key/key-id`."
"* The S3 bucket that you want to use for training and model data."
]
},
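The prerequisites above leave key creation to the console; for readers who prefer to script it, here is a minimal sketch using boto3 (the `Description` string is an illustrative assumption, not part of the notebook):

```python
import boto3

kms = boto3.client('kms')

# Create a customer-managed key; the description is illustrative.
response = kms.create_key(Description='SageMaker data encryption key')

# KeyMetadata carries both the full ARN (arn:aws:kms:region:acct-id:key/key-id)
# and the bare key-id that the notebook uses.
print(response['KeyMetadata']['Arn'])
kms_key_id = response['KeyMetadata']['KeyId']
```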
{
@@ -65,9 +68,9 @@
"assumed_role = boto3.client('sts').get_caller_identity()['Arn']\n",
"role = re.sub(r'^(.+)sts::(\\d+):assumed-role/(.+?)/.*$', r'\\1iam::\\2:role/\\3', assumed_role)\n",
"\n",
"kms_key_id = '<bring your own key-id>'\n",
"kms_key_id = '<your_kms_key_arn_here>'\n",
"\n",
"bucket='<s3 bucket>' # put your s3 bucket name here, and create s3 bucket\n",
"bucket='<your_s3_bucket_name_here>' # put your s3 bucket name here, and create s3 bucket\n",
"prefix = 'sagemarker/kms-new'\n",
"# customize to your bucket where you have stored the data\n",
"bucket_path = 'https://s3-{}.amazonaws.com/{}'.format(region,bucket)"
@@ -98,7 +101,9 @@
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"metadata": {
"collapsed": true
},
"outputs": [],
"source": [
"from sklearn.datasets import load_boston\n",
@@ -125,7 +130,9 @@
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"metadata": {
"collapsed": true
},
"outputs": [],
"source": [
"from sklearn.model_selection import train_test_split\n",
@@ -136,7 +143,9 @@
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"metadata": {
"collapsed": true
},
"outputs": [],
"source": [
"def write_file(X, y, fname):\n",
@@ -154,7 +163,9 @@
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"metadata": {
"collapsed": true
},
"outputs": [],
"source": [
"train_file = 'train.csv'\n",
@@ -217,7 +228,9 @@
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"metadata": {
"collapsed": true
},
"outputs": [],
"source": [
"containers = {'us-west-2': '433757028032.dkr.ecr.us-west-2.amazonaws.com/xgboost:latest',\n",
@@ -430,7 +443,9 @@
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"metadata": {
"collapsed": true
},
"outputs": [],
"source": [
"runtime_client = boto3.client('sagemaker-runtime')"
@@ -439,7 +454,9 @@
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"metadata": {
"collapsed": true
},
"outputs": [],
"source": [
"import sys\n",
@@ -513,15 +530,8 @@
"metadata": {},
"outputs": [],
"source": [
"client.delete_endpoint(EndpointName=endpoint_name)"
"#client.delete_endpoint(EndpointName=endpoint_name)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": []
}
],
"metadata": {
@@ -540,7 +550,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.6.3"
"version": "3.6.2"
},
"notice": "Copyright 2017 Amazon.com, Inc. or its affiliates. All Rights Reserved. Licensed under the Apache License, Version 2.0 (the \"License\"). You may not use this file except in compliance with the License. A copy of the License is located at http://aws.amazon.com/apache2.0/ or in the \"license\" file accompanying this file. This file is distributed on an \"AS IS\" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License."
},
@@ -19,7 +19,16 @@
"\n",
"\n",
"## Introduction\n",
"In this notebook we illustrate how to copy data from Redshift to S3 and vice-versa. We have a Redshift cluster within the same VPC, and have preloaded it with data from the [iris data set](https://archive.ics.uci.edu/ml/datasets/iris). Let's start by installing `psycopg2`, a PostgreSQL database adapter for the Python, adding a few imports and specifying a few configs. "
"In this notebook we illustrate how to copy data from Redshift to S3 and vice-versa.\n",
"\n",
"### Prerequisites\n",
"In order to successfully run this notebook, you'll first need to:\n",
"1. Have a Redshift cluster within the same VPC.\n",
"1. Preload that cluster with data from the [iris data set](https://archive.ics.uci.edu/ml/datasets/iris) in a table named public.irisdata.\n",
"1. Update the credential file (`redshift_creds_template.json.nogit`) file with the appropriate information.\n",
"\n",
"### Notebook Setup\n",
"Let's start by installing `psycopg2`, a PostgreSQL database adapter for the Python, adding a few imports and specifying a few configs. "
]
},
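Since the prerequisites mention a credentials file, a minimal sketch of loading it and opening a connection with `psycopg2` follows; the JSON field names are assumptions about the template's layout, not guaranteed by the notebook:

```python
import json
import psycopg2

# Field names below are assumed from a typical template layout.
with open('redshift_creds_template.json.nogit') as f:
    creds = json.load(f)

conn = psycopg2.connect(host=creds['host_name'],
                        port=creds['port_num'],
                        user=creds['user_name'],
                        password=creds['password'],
                        dbname=creds['db_name'])
```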
{
@@ -35,6 +44,7 @@
"cell_type": "code",
"execution_count": null,
"metadata": {
"collapsed": true,
"isConfigCell": true
},
"outputs": [],
@@ -48,7 +58,7 @@
"\n",
"region = boto3.Session().region_name\n",
"\n",
"bucket='<S3 bucket>' # put your s3 bucket name here, and create s3 bucket\n",
"bucket='<your_s3_bucket_name_here>' # put your s3 bucket name here, and create s3 bucket\n",
"prefix = 'sagemarker/redshift'\n",
"# customize to your bucket where you have stored the data\n",
"\n",
@@ -67,6 +77,7 @@
"cell_type": "code",
"execution_count": null,
"metadata": {
"collapsed": true,
"isConfigCell": true
},
"outputs": [],
@@ -89,7 +100,9 @@
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"metadata": {
"collapsed": true
},
"outputs": [],
"source": [
"print(\"Reading from Redshift...\")\n",
@@ -130,7 +143,9 @@
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"metadata": {
"collapsed": true
},
"outputs": [],
"source": [
"print(\"Writing to S3...\")\n",
@@ -152,7 +167,9 @@
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"metadata": {
"collapsed": true
},
"outputs": [],
"source": [
"print(\"Reading from S3...\")\n",
@@ -169,7 +186,9 @@
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"metadata": {
"collapsed": true
},
"outputs": [],
"source": [
"print(\"Writing to Redshift...\")\n",
@@ -195,7 +214,9 @@
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"metadata": {
"collapsed": true
},
"outputs": [],
"source": [
"pd.options.display.max_rows = 2\n",
@@ -222,7 +243,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.6.3"
"version": "3.6.2"
},
"notice": "Copyright 2017 Amazon.com, Inc. or its affiliates. All Rights Reserved. Licensed under the Apache License, Version 2.0 (the \"License\"). You may not use this file except in compliance with the License. A copy of the License is located at http://aws.amazon.com/apache2.0/ or in the \"license\" file accompanying this file. This file is distributed on an \"AS IS\" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License."
},
@@ -301,7 +301,7 @@
" 'InstanceType':'ml.m4.xlarge',\n",
" 'InitialInstanceCount':1,\n",
" 'InitialVariantWeight':1,\n",
" 'ModelName':model_file_name,\n",
" 'ModelName':model_name,\n",
" 'VariantName':'AllTraffic'}])\n",
"\n",
"print(\"Endpoint Config Arn: \" + create_endpoint_config_response['EndpointConfigArn'])"