TF IC notebook
username committed Aug 26, 2022
1 parent db91f7a commit a9e3f54
Showing 1 changed file with 7 additions and 27 deletions.
@@ -14,7 +14,7 @@
"metadata": {},
"source": [
"---\n",
-"Welcome to Amazon SageMaker [Built-in Algorithms](https://sagemaker.readthedocs.io/en/stable/algorithms/index.html)! You can use SageMaker Built-in algorithms to solve many Machine Learning tasks through [SageMaker Python SDK](https://sagemaker.readthedocs.io/en/stable/overview.html). You can also use these algorithms through one-click in SageMaker Studio via [JumpStart](https://docs.aws.amazon.com/sagemaker/latest/dg/studio-jumpstart.html).\n",
+"Welcome to [Amazon SageMaker Built-in Algorithms](https://sagemaker.readthedocs.io/en/stable/algorithms/index.html)! You can use SageMaker Built-in algorithms to solve many Machine Learning tasks through [SageMaker Python SDK](https://sagemaker.readthedocs.io/en/stable/overview.html). You can also use these algorithms through one-click in SageMaker Studio via [JumpStart](https://docs.aws.amazon.com/sagemaker/latest/dg/studio-jumpstart.html).\n",
"\n",
"In this demo notebook, we demonstrate how to use the TensorFlow Image Classification algorithm. Image Classification refers to classifying an image to one of the class labels of the training dataset. We demonstrate two use cases of TensorFlow Image Classification models:\n",
"\n",
@@ -166,8 +166,7 @@
"source": [
"## 3. Run inference on the pre-trained model\n",
"***\n",
-"Using SageMaker, we can perform inference on the pre-trained model, even without fine-tuning it first on a custom dataset. For this example, that means on an input image, predicting the class label from one of the 1000 classes of the ImageNet dataset. \n",
-"[ImageNetLabels](https://storage.googleapis.com/download.tensorflow.org/data/ImageNetLabels.txt).\n",
+"Using SageMaker, we can perform inference on the pre-trained model, even without fine-tuning it first on a custom dataset. For this example, that means on an input image, predicting the [class label from one of the 1000 classes of the ImageNet dataset](https://storage.googleapis.com/download.tensorflow.org/data/ImageNetLabels.txt).\n",
"***"
]
},
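As context for the hunk above: the endpoint returns class probabilities over ImageNet's 1000 classes, and the predicted label is the one with the highest probability. A minimal illustrative sketch (the `labels` and `probabilities` values below are truncated stand-ins, not real endpoint output):

```python
# Illustrative only: mapping a model's output probabilities to a class label.
# `labels` stands in for the 1000 ImageNet class names from ImageNetLabels.txt;
# `probabilities` stands in for the probability vector an endpoint would return.
labels = ["background", "tench", "goldfish"]  # truncated stand-in list
probabilities = [0.05, 0.15, 0.80]

# Pick the index with the highest probability and look up its label.
best = max(range(len(probabilities)), key=probabilities.__getitem__)
print(labels[best])  # -> goldfish
```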
@@ -244,7 +243,7 @@
"source": [
"### 3.2. Download example images for inference\n",
"***\n",
-"We download example images from the Built-In Algorithms S3 bucket.\n",
+"We download example images from a public S3 bucket.\n",
"***"
]
},
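For context on the hunk above: objects in a public S3 bucket can be fetched over plain HTTPS without credentials. A minimal sketch of building such a URL; the object key below is hypothetical, chosen only to illustrate the shape:

```python
# Illustrative only: the object key below is hypothetical.
# Public S3 objects are reachable over HTTPS at a URL of this shape.
def s3_public_url(bucket: str, key: str) -> str:
    """Return the virtual-hosted-style HTTPS URL for a public S3 object."""
    return f"https://{bucket}.s3.amazonaws.com/{key}"

url = s3_public_url("jumpstart-cache-prod-us-west-2", "inference-notebook-assets/example.jpg")
print(url)
```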
@@ -343,21 +342,14 @@
"***\n",
"Previously, we saw how to run inference on a pre-trained model. Next, we discuss how a model can be fine-tuned to a custom dataset with any number of classes. \n",
"\n",
-"The model available for fine-tuning attaches a classification layer to the corresponding feature extractor model available on TensorFlow/PyTorch hub, and initializes the layer parameters to random values. The output dimension of the classification layer\n",
-"is determined based on the number of classes in the input data. The fine-tuning step fine-tunes the model parameters. The objective is to minimize classification error on the input data. The model returned by fine-tuning can be further deployed for inference. Below are the instructions for how the training data should be formatted for input to the model. \n",
+"The model available for fine-tuning attaches a classification layer to the corresponding feature extractor model available on TensorFlow/PyTorch hub, and initializes the layer parameters to random values. The output dimension of the classification layer is determined based on the number of classes in the input data. The fine-tuning step fine-tunes the model parameters. The objective is to minimize classification error on the input data. The model returned by fine-tuning can be further deployed for inference. Below are the instructions for how the training data should be formatted for input to the model.\n",
"\n",
"- **Input:** A directory with as many sub-directories as the number of classes. \n",
" - Each sub-directory should have images belonging to that class in .jpg format. \n",
"- **Output:** A trained model that can be deployed for inference. \n",
" - A label mapping file is saved along with the trained model file on the s3 bucket. \n",
" \n",
-"The input directory should look like below if \n",
-"the training data contains images from two classes: roses and dandelion. The s3 path should look like\n",
-"`s3://bucket_name/input_directory/`. Note the trailing `/` is required. The names of the folders and 'roses', 'dandelion', and the .jpg filenames\n",
-"can be anything. The label mapping file that is saved along with the trained model on the s3 bucket maps the \n",
-"folder names 'roses' and 'dandelion' to the indices in the list of class probabilities the model outputs.\n",
-"The mapping follows alphabetical ordering of the folder names. In the example below, index 0 in the model output list\n",
-"would correspond to 'dandelion' and index 1 would correspond to 'roses'.\n",
+"The input directory should look like below if the training data contains images from two classes: roses and dandelion. The s3 path should look like `s3://bucket_name/input_directory/`. Note the trailing `/` is required. The names of the folders and 'roses', 'dandelion', and the .jpg filenames can be anything. The label mapping file that is saved along with the trained model on the s3 bucket maps the folder names 'roses' and 'dandelion' to the indices in the list of class probabilities the model outputs. The mapping follows alphabetical ordering of the folder names. In the example below, index 0 in the model output list would correspond to 'dandelion' and index 1 would correspond to 'roses'.\n",
"\n",
" input_directory\n",
" |--roses\n",
@@ -367,19 +359,7 @@
" |--ghi.jpg\n",
" |--jkl.jpg\n",
"\n",
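The alphabetical label-to-index rule described in the hunk above can be reproduced in a few lines of pure Python, which makes it easy to check which output index corresponds to which class folder:

```python
# Illustrative only: reproduces the alphabetical label-index mapping the
# notebook describes for a two-class input directory.
class_folders = ["roses", "dandelion"]  # sub-directory names under input_directory/

# The label mapping follows alphabetical ordering of the folder names,
# so each sorted position is that class's index in the model's output list.
label_mapping = {index: name for index, name in enumerate(sorted(class_folders))}
print(label_mapping)  # -> {0: 'dandelion', 1: 'roses'}
```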
-"We provide tf_flowers dataset as a default dataset for fine-tuning the model. \n",
-"tf_flower comprises images of five types of flowers. \n",
-"The dataset has been downloaded from [TensorFlow](https://www.tensorflow.org/datasets/catalog/tf_flowers). \n",
-"[Apache 2.0 License](https://jumpstart-cache-prod-us-west-2.s3-us-west-2.amazonaws.com/licenses/Apache-License/LICENSE-2.0.txt).\n",
-"Citation:\n",
-"<sub><sup>\n",
-"@ONLINE {tfflowers,\n",
-"author = \"The TensorFlow Team\",\n",
-"title = \"Flowers\",\n",
-"month = \"jan\",\n",
-"year = \"2019\",\n",
-"url = \"http://download.tensorflow.org/example_images/flower_photos.tgz\" }\n",
-"</sup></sub> source: [TensorFlow Hub](model_url). \n",
+"We provide the tf_flowers dataset as a default dataset for fine-tuning the model. tf_flowers comprises images of five types of flowers. The dataset has been downloaded from [TensorFlow](https://www.tensorflow.org/datasets/catalog/tf_flowers). [Apache 2.0 License](https://jumpstart-cache-prod-us-west-2.s3-us-west-2.amazonaws.com/licenses/Apache-License/LICENSE-2.0.txt).\n",
"***"
]
},
@@ -604,7 +584,7 @@
"source": [
"## 4.5. Deploy & run Inference on the fine-tuned model\n",
"***\n",
-"A trained model does nothing on its own. We now want to use the model to perform inference. For this example, that means predicting the class label of an image. We follow the same steps as in [3. Run inference on the pre-trained model](#3.-Run-inference-on-the-pre-trained-model). We start by retrieving the artifacts for deploying an endpoint. However, instead of base_predictor, we deploy the `ic_estimator` that we fine-tuned.\n",
+"A trained model does nothing on its own. We now want to use the model to perform inference. For this example, that means predicting the class label of an image. We follow the same steps as in Section 3 [Run inference on the pre-trained model](#3.-Run-inference-on-the-pre-trained-model). We start by retrieving the artifacts for deploying an endpoint. However, instead of base_predictor, we deploy the `ic_estimator` that we fine-tuned.\n",
"***"
]
},
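For context on the hunk above: after deployment, the fine-tuned endpoint returns a list of class probabilities whose order matches the saved label mapping file. A minimal sketch of post-processing such a response into the top predicted labels; the `labels` and `response_probabilities` values below are stand-ins for the label mapping file contents and a real endpoint response:

```python
# Illustrative only: turning a fine-tuned endpoint's probability list into
# the top predicted labels. `labels` stands in for the saved label mapping
# (alphabetical order of the tf_flowers class folders); `response_probabilities`
# stands in for the endpoint response.
labels = ["daisy", "dandelion", "roses", "sunflowers", "tulips"]
response_probabilities = [0.02, 0.07, 0.68, 0.20, 0.03]

# Sort class indices by descending probability and keep the top two.
top2 = sorted(range(len(response_probabilities)),
              key=response_probabilities.__getitem__, reverse=True)[:2]
print([labels[i] for i in top2])  # -> ['roses', 'sunflowers']
```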
