|
4 | 4 | "cell_type": "markdown",
|
5 | 5 | "metadata": {},
|
6 | 6 | "source": [
|
7 | | - "# Building your own algorithm container\n",
| 7 | + "# Building your own TensorFlow container\n",
8 | 8 | "\n",
|
9 | | - "With Amazon SageMaker, you can package your own algorithms that can than be trained and deployed in the SageMaker environment. This notebook will guide you through an example that shows you how to build a Docker container for SageMaker and use it for training and inference.\n",
| 9 | + "With Amazon SageMaker, you can package your own algorithms that can then be trained and deployed in the SageMaker environment. This notebook will guide you through an example, using TensorFlow, that shows you how to build a Docker container for SageMaker and use it for training and inference.\n",
10 | 10 | "\n",
|
11 | 11 | "By packaging an algorithm in a container, you can bring almost any code to the Amazon SageMaker environment, regardless of programming language, environment, framework, or dependencies. \n",
|
12 | 12 | "\n",
|
|
51 | 51 | "\n",
|
52 | 52 | "## The example\n",
|
53 | 53 | "\n",
|
54 | | - "Here, we'll show how to package a custom TensorFlow container with a python example which works with the CIFAR-10 dataset and utilizes TensorFlow Serving for inferences. The example is purposefully fairly trivial since the point is to show the surrounding structure that you'll want to add to your own code so you can train and host it in Amazon SageMaker.\n",
| 54 | + "Here, we'll show how to package a custom TensorFlow container with a Python example which works with the CIFAR-10 dataset and utilizes TensorFlow Serving for inferences. The example is purposefully fairly trivial since the point is to show the surrounding structure that you'll want to add to your own code so you can train and host it in Amazon SageMaker.\n",
55 | 55 | "\n",
|
56 | 56 | "The ideas shown here will work in any language or environment. You'll need to choose the right tools for your environment to serve HTTP requests for inference, but good HTTP environments are available in every language these days.\n",
|
57 | 57 | "\n",
|
|
365 | 365 | "1. IAM role - our AWS execution role\n",
|
366 | 366 | "2. train_instance_count - number of instances to use for training.\n",
|
367 | 367 | "3. train_instance_type - type of instance to use for training. For training locally, we will specify `local`.\n",
|
368 | | - "4. image_name - our custom TensorFlow docker image we created.\n",
| 368 | + "4. image_name - our custom TensorFlow Docker image we created.\n",
369 | 369 | "5. hyperparameters - hyperparameters we want to pass.\n",
|
370 | 370 | "\n",
|
371 | | - "Lets start with setting up our IAM role. We will make use of a helper function within the Python SDK. This function will throw an exception if run outside of a SageMaker notebook instance, as it gets metadata from the notebook instance. If running outside, please provide an IAM role with proper access stated above in [Permissions](#Permissions)."
| 371 | + "Let's start with setting up our IAM role. We will make use of a helper function within the Python SDK. This function will throw an exception if run outside of a SageMaker notebook instance, as it gets metadata from the notebook instance. If running outside, please provide an IAM role with proper access stated above in [Permissions](#Permissions)."
372 | 372 | ]
|
373 | 373 | },
|
374 | 374 | {
|
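For context, a minimal sketch of how the five arguments listed above typically come together with the SageMaker Python SDK's `Estimator` (SDK v1 style); the image tag, hyperparameter, and training data path below are placeholders rather than the notebook's actual values:

```python
import sagemaker
from sagemaker.estimator import Estimator

# Reads the role from notebook-instance metadata; outside a notebook instance,
# pass an IAM role ARN with the permissions described in the Permissions section.
role = sagemaker.get_execution_role()

estimator = Estimator(image_name='sagemaker-tf-cifar10-example:latest',  # placeholder tag
                      role=role,
                      train_instance_count=1,
                      train_instance_type='local',  # 'local' runs training on this machine
                      hyperparameters={'train-steps': 100})  # placeholder hyperparameter

# Local mode accepts a file:// URI; on SageMaker you would pass an S3 URI instead.
estimator.fit('file:///tmp/cifar-10-data')
```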
|
539 | 539 | "\n",
|
540 | 540 | "data = {'instances': numpy.asarray(image).astype(float).tolist()}\n",
|
541 | 541 | "\n",
|
| 542 | + "# The request and response format is JSON for TensorFlow Serving.\n",
| 543 | + "# For more information: https://www.tensorflow.org/serving/api_rest#predict_api\n",
542 | 544 | "predictor.accept = 'application/json'\n",
|
543 | 545 | "predictor.content_type = 'application/json'\n",
|
544 | 546 | "\n",
|
545 | 547 | "predictor.serializer = json_serializer\n",
|
546 | 548 | "predictor.deserializer = json_deserializer\n",
|
547 | 549 | "\n",
|
| 550 | + "# For more information on the predictor class, see:\n",
| 551 | + "# https://github.com/aws/sagemaker-python-sdk/blob/master/src/sagemaker/predictor.py\n",
548 | 552 | "predictor.predict(data)"
|
549 | 553 | ]
|
550 | 554 | },
|
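To make the JSON format referenced in the comments above concrete, here is a small sketch using a fake all-zero image; the response values are illustrative only:

```python
import json

pixel = [0.0, 0.0, 0.0]        # one RGB pixel
image = [[pixel] * 32] * 32    # a fake 32x32x3 CIFAR-10-sized image

# TensorFlow Serving's REST predict API takes a JSON object with an
# "instances" list, one entry per input example.
request_body = json.dumps({'instances': [image]})

# A successful response is also JSON, with a "predictions" list holding one
# result per instance (here, made-up scores for the ten CIFAR-10 classes).
example_response = '{"predictions": [[0.1, 0.05, 0.05, 0.1, 0.1, 0.1, 0.1, 0.1, 0.2, 0.1]]}'
print(json.loads(example_response)['predictions'][0])
```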
|
745 | 749 | "endpoint_name = predictor.endpoint\n",
|
746 | 750 | "\n",
|
747 | 751 | "response = client.invoke_endpoint(EndpointName=endpoint_name, Body=json.dumps(data))\n",
|
748 | | - "response_body = response['Body']\n",
| 752 | + "response_body = response['Body'].read().decode('utf-8')\n",
749 | 753 | "\n",
|
750 | | - "print(response_body.read())"
| 754 | + "print(response_body)"
|
751 | 755 | ]
|
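For completeness, a self-contained version of this call might look like the sketch below; it assumes a boto3 release that exposes the `sagemaker-runtime` client name and reuses `endpoint_name` and `data` from the cells above:

```python
import json
import boto3

runtime = boto3.client('sagemaker-runtime')

response = runtime.invoke_endpoint(EndpointName=endpoint_name,
                                   ContentType='application/json',
                                   Body=json.dumps(data))

# Body is a streaming object: read it fully, then decode and parse the JSON payload.
result = json.loads(response['Body'].read().decode('utf-8'))
print(result)
```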
|
781 | 785 | "- [Dockerfile](https://docs.docker.com/engine/reference/builder/)\n",
|
782 | 786 | "- [scikit-bring-your-own](https://github.com/awslabs/amazon-sagemaker-examples/blob/master/advanced_functionality/scikit_bring_your_own/scikit_bring_your_own.ipynb)"
|
783 | 787 | ]
|
784 | | - },
785 | | - {
786 | | - "cell_type": "code",
787 | | - "execution_count": null,
788 | | - "metadata": {},
789 | | - "outputs": [],
790 | | - "source": []
791 | 788 | }
|
792 | 789 | ],
|
793 | 790 | "metadata": {
|
|