|
83 | 83 | "### Dataset\n",
|
84 | 84 | "The dataset we are using is [Caltech Birds (CUB-200-2011)](http://www.vision.caltech.edu/visipedia/CUB-200-2011.html). \n",
|
85 | 85 | "\n",
|
86 |
| - "Here we are using the artifacts from previous labs, in order to get the s3 location below for the Test images and Test data annotation file, and classification model.\n", |
| 86 | + "Here we are using the artifacts from previous labs:\n", |
87 | 87 | "\n",
|
88 | 88 | "- S3 path for test image data\n",
|
89 | 89 | "- S3 path for test data annotation file\n",
|
|
119 | 119 | "metadata": {},
|
120 | 120 | "source": [
|
121 | 121 | "### Prepare the Script and Dockerfile\n",
|
122 |
| - "With SageMaker, you can run data processing jobs using the SKLearnProcessor, popular ML frameworks processors, Apache Spark, or BYOC. To learn more about [SageMaker Processing](https://docs.aws.amazon.com/sagemaker/latest/dg/processing-job.html)\n", |
| 122 | + "With SageMaker, you can run data processing jobs using the SKLearnProcessor, popular ML framework processors, Apache Spark, or BYOC. To learn more, visit [SageMaker Processing](https://docs.aws.amazon.com/sagemaker/latest/dg/processing-job.html).\n", |
123 | 123 | "\n",
|
124 |
| - "For this example we are going to practice using ScriptProcess and Bring Our Own Container (BYOC). ScriptProcess require you to feed a container uri from ECR and a custom script for the process." |
| 124 | + "For this example, we are going to practice using ScriptProcessor and Bring Your Own Container (BYOC). ScriptProcessor requires you to provide a container URI from ECR and a custom script for the processing job." |
125 | 125 | ]
|
126 | 126 | },
|
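To make the ScriptProcessor flow concrete, here is a minimal sketch of how such a processing job could be launched with a custom image. The image URI, script name, S3 paths, and instance settings below are placeholders, not values from this lab:

```python
# Minimal ScriptProcessor sketch -- image URI, script name, and S3 paths
# are placeholders, not the values used in this lab.
from sagemaker import get_execution_role
from sagemaker.processing import ScriptProcessor, ProcessingInput, ProcessingOutput

role = get_execution_role()

script_processor = ScriptProcessor(
    image_uri="<account-id>.dkr.ecr.<region>.amazonaws.com/<repo>:latest",  # custom BYOC image in ECR
    command=["python3"],            # how the container invokes the supplied script
    role=role,
    instance_count=1,
    instance_type="ml.m5.xlarge",
)

script_processor.run(
    code="evaluation.py",           # hypothetical custom processing script
    inputs=[
        ProcessingInput(
            source="s3://<bucket>/test-images/",   # placeholder test data location
            destination="/opt/ml/processing/input",
        )
    ],
    outputs=[
        ProcessingOutput(
            source="/opt/ml/processing/output",
            destination="s3://<bucket>/evaluation-output/",
        )
    ],
)
```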
127 | 127 | {
|
|
143 | 143 | "cell_type": "markdown",
|
144 | 144 | "metadata": {},
|
145 | 145 | "source": [
|
146 |
| - "#### Bring Our Own Container (BYOC)\n", |
| 146 | + "#### Bring Your Own Container (BYOC)\n", |
147 | 147 | "Below we build a custom Docker container and push it to Amazon Elastic Container Registry (ECR).\n",
|
148 | 148 | "\n",
|
149 | 149 | "You can use the standard TensorFlow container, but ScriptProcessor currently does not support `source_dir` for a custom requirements.txt and multiple Python files. That feature is on the roadmap; please follow this [thread](https://github.com/aws/sagemaker-python-sdk/issues/1248) for updates."
|
|
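As a rough illustration of the kind of Dockerfile this section builds, the sketch below bakes the dependencies into the image itself, which is the workaround for the missing `source_dir` support. The base image and package list are assumptions, not the lab's exact choices:

```dockerfile
# Hypothetical Dockerfile sketch -- base image and packages are assumptions.
FROM tensorflow/tensorflow:2.4.1

# ScriptProcessor cannot ship a requirements.txt via source_dir, so
# dependencies are installed into the image at build time instead.
COPY requirements.txt /tmp/requirements.txt
RUN pip install --no-cache-dir -r /tmp/requirements.txt

ENV PYTHONUNBUFFERED=TRUE
```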
204 | 204 | "source": [
|
205 | 205 | "The easiest way to build a container image and push it to ECR is to use the Studio Image Build CLI. This requires certain permissions for your SageMaker execution role, which are already provided in this setup. \n",
|
206 | 206 | "\n",
|
207 |
| - "But please check this [blog](https://aws.amazon.com/blogs/machine-learning/using-the-amazon-sagemaker-studio-image-build-cli-to-build-container-images-from-your-studio-notebooks/) for additional information on how to use the Amazon SageMaker Studio Image Build CLI to build container images from your Studio notebooks in case you need to update your role policy. " |
| 207 | + "Please check this [blog](https://aws.amazon.com/blogs/machine-learning/using-the-amazon-sagemaker-studio-image-build-cli-to-build-container-images-from-your-studio-notebooks/) for additional information on how to use the Amazon SageMaker Studio Image Build CLI to build container images from your Studio notebooks, in case you need to update your role policy. " |
208 | 208 | ]
|
209 | 209 | },
|
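For reference, a typical invocation of the Studio Image Build CLI looks like the following; the repository name and tag are placeholders:

```sh
# One-time install of the Studio Image Build CLI.
pip install sagemaker-studio-image-build

# Build the Dockerfile in the current directory and push the image to ECR;
# "cub-processing:latest" is a placeholder repository name and tag.
sm-docker build . --repository cub-processing:latest
```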
210 | 210 | {
|
|