|
4 | 4 | "cell_type": "markdown",
|
5 | 5 | "metadata": {},
|
6 | 6 | "source": [
|
7 |
| - "# Creating, training and serving Estimators in SageMaker\n", |
| 7 | + "# Creating, training, and serving models using SageMaker Estimators\n", |
8 | 8 | "\n",
|
9 |
| - "The **SageMaker Python SDK** helps you deploy your models for training and hosting in optimized, productions ready containers in SageMaker. The SageMaker Python SDK is easy to use, modular, extensible and compatible with TensorFlow and MXNet. This tutorial focuses on **TensorFlow** and shows how we can train and host a tensorflow DNNClassifier estimator in SageMaker using the Python SDK.\n", |
| 9 | + "The **SageMaker Python SDK** helps you deploy your models for training and hosting in optimized, production-ready containers in SageMaker. The SageMaker Python SDK is easy to use, modular, extensible, and compatible with TensorFlow and MXNet. This tutorial focuses on **TensorFlow** and shows how we can train and host a TensorFlow DNNClassifier estimator in SageMaker using the Python SDK.\n", |
10 | 10 | "\n",
|
11 | 11 | "\n",
|
12 | 12 | "TensorFlow's high-level machine learning API (tf.estimator) makes it easy to\n",
|
13 | 13 | "configure, train, and evaluate a variety of machine learning models.\n",
|
14 | 14 | "\n",
|
15 | 15 | "\n",
|
16 |
| - "In this\n", |
17 |
| - "tutorial, you'll use tf.estimator to construct a\n", |
| 16 | + "In this tutorial, you'll use tf.estimator to construct a\n", |
18 | 17 | "[neural network](https://en.wikipedia.org/wiki/Artificial_neural_network)\n",
|
19 | 18 | "classifier and train it on the\n",
|
20 | 19 | "[Iris data set](https://en.wikipedia.org/wiki/Iris_flower_data_set) to\n",
|
21 | 20 | "predict flower species based on sepal/petal geometry. You'll write code to\n",
|
22 | 21 | "perform the following five steps:\n",
|
23 | 22 | "\n",
|
24 |
| - "1. Deploy a tensorflow container in SageMaker\n", |
| 23 | + "1. Deploy a TensorFlow container in SageMaker\n", |
25 | 24 | "2. Load CSVs containing Iris training/test data from an S3 bucket into a TensorFlow `Dataset`\n",
|
26 | 25 | "3. Construct a `tf.estimator.DNNClassifier` neural network classifier\n",
|
27 | 26 | "4. Train the model using the training data\n",
|
|
77 | 76 | " iris_training.csv\n",
|
78 | 77 | "* A test set of 30 samples\n",
|
79 | 78 | " iris_test.csv\n",
|
80 |
| - " \n", |
81 |
| - "These files are provided in the SageMaker sample data bucket: \n", |
82 |
| - "s3://sagemaker-sample-data/tensorflow/iris" |
| 79 | + "\n", |
| 80 | + "These files are provided in the SageMaker sample data bucket:\n", |
| 81 | + "**s3://sagemaker-sample-data-{region}/tensorflow/iris**. Copies of the bucket exist in each SageMaker region. When we access the data, we'll replace {region} with the AWS region the notebook is running in." |
83 | 82 | ]
|
84 | 83 | },
|
85 | 84 | {
|
|
99 | 98 | "source": [
|
100 | 99 | "#Bucket location to save your custom code in tar.gz format.\n",
|
101 | 100 | "custom_code_upload_location = 's3://<bucket-name>/customcode/tensorflow_iris'\n",
|
| 101 | + "\n", |
102 | 102 | "#Bucket location where results of model training are saved.\n",
|
103 | 103 | "model_artifacts_location = 's3://<bucket-name>/artifacts'\n",
|
| 104 | + "\n", |
| 105 | + "#IAM execution role that gives SageMaker access to resources in your AWS account.\n", |
104 | 106 | "role='<your SageMaker execution role here>'"
|
105 | 107 | ]
|
106 | 108 | },
|
|
138 | 140 | "cell_type": "markdown",
|
139 | 141 | "metadata": {},
|
140 | 142 | "source": [
|
141 |
| - "# Construct a Deep Neural Network Classifier" |
| 143 | + "# Construct a deep neural network classifier" |
142 | 144 | ]
|
143 | 145 | },
|
144 | 146 | {
|
145 | 147 | "cell_type": "markdown",
|
146 | 148 | "metadata": {},
|
147 | 149 | "source": [
|
148 |
| - "## Complete Neural Network Source Code \n", |
| 150 | + "## Complete neural network source code\n", |
| 151 | + "\n", |
149 | 152 | "Here is the full code for the neural network classifier:"
|
150 | 153 | ]
|
151 | 154 | },
|
|
155 | 158 | "metadata": {},
|
156 | 159 | "outputs": [],
|
157 | 160 | "source": [
|
158 |
| - "!cat \"iris_dnn_classifier.py\"" |
| 161 | + "!cat \"/home/ec2-user/sample-notebooks/sagemaker-python-sdk/tensorflow_iris_dnn_classifier_using_estimators/iris_dnn_classifier.py\"" |
159 | 162 | ]
|
160 | 163 | },
|
161 | 164 | {
|
|
170 | 173 | "metadata": {},
|
171 | 174 | "source": [
|
172 | 175 | "### Using a tf.estimator in SageMaker\n",
|
173 |
| - "Using an estimator in SageMaker is very easy, you can create one with few lines of code:" |
| 176 | + "Using a TensorFlow estimator in SageMaker is easy; you can create one with a few lines of code:" |
174 | 177 | ]
|
175 | 178 | },
|
176 | 179 | {
|
|
245 | 248 | "source": [
|
246 | 249 | "### Describe the serving input pipeline:\n",
|
247 | 250 | "\n",
|
248 |
| - "After traininng your model, SageMaker will host it in a tensorflow serving. You need to describe a serving input function:" |
| 251 | + "After training your model, SageMaker hosts it using TensorFlow Serving. You need to describe a serving input function:" |
249 | 252 | ]
|
250 | 253 | },
|
251 | 254 | {
|
|
270 | 273 | "cell_type": "markdown",
|
271 | 274 | "metadata": {},
|
272 | 275 | "source": [
|
273 |
| - "# Train a Model on Amazon SageMaker Using TensorFlow Custom Code\n", |
| 276 | + "# Train a model on Amazon SageMaker using TensorFlow custom code\n", |
274 | 277 | "\n",
|
275 | 278 | "We can use the SDK to run our local training script on SageMaker infrastructure.\n",
|
276 | 279 | "\n",
|
|
292 | 295 | " code_location=custom_code_upload_location,\n",
|
293 | 296 | " train_instance_count=1,\n",
|
294 | 297 | " train_instance_type='ml.c4.xlarge',\n",
|
295 |
| - " hyperparameters={'training_steps': 100})\n", |
296 |
| - "\n" |
| 298 | + " training_steps=1000,\n", |
| 299 | + " evaluation_steps=100)" |
297 | 300 | ]
|
298 | 301 | },
|
299 | 302 | {
|
|
305 | 308 | "%%time\n",
|
306 | 309 | "import boto3\n",
|
307 | 310 | "\n",
|
| 311 | + "# use the region-specific sample data bucket\n", |
308 | 312 | "region = boto3.Session().region_name\n",
|
309 | 313 | "train_data_location = 's3://sagemaker-sample-data-{}/tensorflow/iris'.format(region)\n",
|
310 | 314 | "\n",
|
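As a sanity check of the bucket naming scheme described above, here is what the formatted URI resolves to for one concrete region. This is a standalone sketch assuming `us-east-1`; the notebook itself detects the region with `boto3`, which requires AWS credentials:

```python
# Standalone sketch: build the region-specific sample-data URI without
# calling boto3. The region 'us-east-1' is assumed here for illustration.
region = 'us-east-1'
train_data_location = 's3://sagemaker-sample-data-{}/tensorflow/iris'.format(region)
print(train_data_location)  # s3://sagemaker-sample-data-us-east-1/tensorflow/iris
```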
|
315 | 319 | "cell_type": "markdown",
|
316 | 320 | "metadata": {},
|
317 | 321 | "source": [
|
318 |
| - "# Deploy the Trained Model \n", |
| 322 | + "# Deploy the trained model\n", |
319 | 323 | "\n",
|
320 | 324 | "The deploy() method creates an endpoint that serves prediction requests in real time."
|
321 | 325 | ]
|
|
329 | 333 | "outputs": [],
|
330 | 334 | "source": [
|
331 | 335 | "%%time\n",
|
332 |
| - "\n", |
333 | 336 | "iris_predictor = iris_estimator.deploy(initial_instance_count=1,\n",
|
334 | 337 | " instance_type='ml.c4.xlarge')"
|
335 | 338 | ]
|
|
338 | 341 | "cell_type": "markdown",
|
339 | 342 | "metadata": {},
|
340 | 343 | "source": [
|
341 |
| - "# Invoke the Endpoint to Get Inferences" |
| 344 | + "# Invoke the endpoint to get inferences" |
342 | 345 | ]
|
343 | 346 | },
|
344 | 347 | {
|
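The DNNClassifier predicts a class index from 0 to 2; mapping it back to a species name is a small post-processing step. The helper below is a hypothetical illustration, not part of the notebook; the label order shown follows the standard Iris dataset:

```python
# Hypothetical post-processing helper (not part of the notebook): map the
# class index returned by the endpoint to an iris species name. The order
# matches the standard Iris dataset labels.
SPECIES = ['Iris setosa', 'Iris versicolor', 'Iris virginica']

def class_to_species(class_index):
    return SPECIES[class_index]

print(class_to_species(1))  # Iris versicolor
```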
|
366 | 369 | "cell_type": "markdown",
|
367 | 370 | "metadata": {},
|
368 | 371 | "source": [
|
369 |
| - "# (Optional) Delete the Endpoint" |
| 372 | + "# (Optional) Delete the endpoint\n", |
| 373 | + "\n", |
| 374 | + "After you have finished with this example, remember to delete the prediction endpoint to release the instance(s) associated with it." |
370 | 375 | ]
|
371 | 376 | },
|
372 | 377 | {
|
|
398 | 403 | }
|
399 | 404 | ],
|
400 | 405 | "metadata": {
|
401 |
| - "notice": "Copyright 2017 Amazon.com, Inc. or its affiliates. All Rights Reserved. Licensed under the Apache License, Version 2.0 (the \"License\"). You may not use this file except in compliance with the License. A copy of the License is located at http://aws.amazon.com/apache2.0/ or in the \"license\" file accompanying this file. This file is distributed on an \"AS IS\" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License.", |
402 | 406 | "kernelspec": {
|
403 | 407 | "display_name": "Environment (conda_tensorflow_p27)",
|
404 | 408 | "language": "python",
|
|
415 | 419 | "nbconvert_exporter": "python",
|
416 | 420 | "pygments_lexer": "ipython2",
|
417 | 421 | "version": "2.7.14"
|
418 |
| - } |
| 422 | + }, |
| 423 | + "notice": "Copyright 2017 Amazon.com, Inc. or its affiliates. All Rights Reserved. Licensed under the Apache License, Version 2.0 (the \"License\"). You may not use this file except in compliance with the License. A copy of the License is located at http://aws.amazon.com/apache2.0/ or in the \"license\" file accompanying this file. This file is distributed on an \"AS IS\" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License." |
419 | 424 | },
|
420 | 425 | "nbformat": 4,
|
421 | 426 | "nbformat_minor": 1
|
|