From 8f3ea81e51e5172677611487edbfb3f0ac094d6d Mon Sep 17 00:00:00 2001
From: atqy
Date: Fri, 5 Aug 2022 11:11:11 -0700
Subject: [PATCH] fix links and incorrectly used code blocks

---
 .../hpo_huggingface_text_classification_20_newsgroups.ipynb | 6 +++---
 1 file changed, 3 insertions(+), 3 deletions(-)

diff --git a/hyperparameter_tuning/huggingface_multiclass_text_classification_20_newsgroups/hpo_huggingface_text_classification_20_newsgroups.ipynb b/hyperparameter_tuning/huggingface_multiclass_text_classification_20_newsgroups/hpo_huggingface_text_classification_20_newsgroups.ipynb
index 0029cfa3d0..db4657f5bb 100644
--- a/hyperparameter_tuning/huggingface_multiclass_text_classification_20_newsgroups/hpo_huggingface_text_classification_20_newsgroups.ipynb
+++ b/hyperparameter_tuning/huggingface_multiclass_text_classification_20_newsgroups/hpo_huggingface_text_classification_20_newsgroups.ipynb
@@ -14,7 +14,7 @@
     "Text Classification can be used to solve various use-cases like sentiment analysis, spam detection, hashtag prediction etc. \n",
     "\n",
     "\n",
-    "This notebook demonstrates the use of the [HuggingFace `transformers` library](https://huggingface.co/transformers/) together with a custom Amazon sagemaker-sdk extension to fine-tune a pre-trained transformer on multi class text classification. In particular, the pre-trained model will be fine-tuned using the [`20 newsgroups dataset`](http://qwone.com/~jason/20Newsgroups/). To get started, we need to set up the environment with a few prerequisite steps, for permissions, configurations, and so on."
+    "This notebook demonstrates the use of the [HuggingFace Transformers library](https://huggingface.co/transformers/) together with a custom Amazon sagemaker-sdk extension to fine-tune a pre-trained transformer on multi class text classification. In particular, the pre-trained model will be fine-tuned using the [20 Newsgroups dataset](http://qwone.com/~jason/20Newsgroups/). To get started, we need to set up the environment with a few prerequisite steps, for permissions, configurations, and so on."
    ]
   },
   {
@@ -107,7 +107,7 @@
     "\n",
     "Now we'll download a dataset from the web on which we want to train the text classification model.\n",
     "\n",
-    "In this example, let us train the text classification model on the [`20 newsgroups dataset`](http://qwone.com/~jason/20Newsgroups/). The `20 newsgroups dataset` consists of 20000 messages taken from 20 Usenet newsgroups."
+    "In this example, let us train the text classification model on the [20 Newsgroups dataset](http://qwone.com/~jason/20Newsgroups/). The 20 Newsgroups dataset consists of 20000 messages taken from 20 Usenet newsgroups."
    ]
   },
   {
@@ -1040,7 +1040,7 @@
    "cell_type": "markdown",
    "metadata": {},
    "source": [
-    "Now, let's define the SageMaker `HuggingFace` estimator with resource configurations and hyperparameters to train Text Classification on `20 newsgroups` dataset, running on a `p3.2xlarge` instance."
+    "Now, let's define the SageMaker `HuggingFace` estimator with resource configurations and hyperparameters to train Text Classification on 20 Newsgroups dataset, running on a `p3.2xlarge` instance."
    ]
   },
   {
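
For reference, the last markdown cell touched by this patch introduces the notebook's SageMaker `HuggingFace` estimator and hyperparameter-tuning step. Below is a minimal sketch of what that setup can look like with the SageMaker Python SDK; the script name `train.py`, the framework version pins, the metric regex, and the tuning ranges are illustrative assumptions, not values taken from this patch or the notebook it edits.

```python
# Illustrative sketch only: entry_point, version pins, metric regex, and tuning
# ranges below are assumptions, not values from the patched notebook.
import sagemaker
from sagemaker.huggingface import HuggingFace
from sagemaker.tuner import ContinuousParameter, HyperparameterTuner

role = sagemaker.get_execution_role()

# Base estimator: fine-tune a pre-trained transformer on one ml.p3.2xlarge instance.
huggingface_estimator = HuggingFace(
    entry_point="train.py",            # hypothetical training script name
    instance_type="ml.p3.2xlarge",
    instance_count=1,
    role=role,
    transformers_version="4.6",        # assumed framework versions
    pytorch_version="1.7",
    py_version="py36",
    hyperparameters={"epochs": 3, "train_batch_size": 16},
)

# Tuner: search the learning rate against an accuracy metric emitted in the training logs.
tuner = HyperparameterTuner(
    estimator=huggingface_estimator,
    objective_metric_name="eval_accuracy",
    hyperparameter_ranges={"learning_rate": ContinuousParameter(1e-5, 5e-5)},
    metric_definitions=[{"Name": "eval_accuracy", "Regex": "eval_accuracy = ([0-9\\.]+)"}],
    max_jobs=4,
    max_parallel_jobs=2,
)

# tuner.fit({"train": train_input_path, "test": test_input_path})  # S3 URIs assumed
```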