
Commit 08dba3e

older resources section and explanations
1 parent 8af6abd commit 08dba3e


README.md

Lines changed: 15 additions & 9 deletions
@@ -2,38 +2,44 @@
 
 This repository contains examples and related resources regarding Amazon SageMaker Script Mode and SageMaker Processing. With Script Mode, you can use training scripts similar to those you would use outside SageMaker with SageMaker's prebuilt containers for various frameworks such as TensorFlow, PyTorch, and Apache MXNet. Similarly, in SageMaker Processing, you can supply ordinary data preprocessing scripts for almost any language or technology you wish to use, such as the R programming language.
 
-Currently this repository has resources for **TensorFlow**, **Bring Your Own** (BYO models and a Script Mode-style experience with your own containers), and **Miscellaneous** (a Script Mode-style experience for SageMaker Processing, etc.).
+Currently this repository has resources for **TensorFlow**, **Bring Your Own** (BYO models and a Script Mode-style experience with your own containers), and **Miscellaneous** (a Script Mode-style experience for SageMaker Processing, etc.). There is also an **Older Resources** section with examples that use older framework versions.
 
-- **TensorFlow resources:**
+For those new to SageMaker, there is a set of 2-hour workshops covering the basics at https://github.com/awslabs/amazon-sagemaker-workshop.
 
-  - [**TensorFlow 2 Workflow**](tf-2-workflow): This example shows a complete workflow for TensorFlow 2. To begin, SageMaker Processing is used to transform the dataset. Next, Local Mode training and Local Mode endpoints are demonstrated for prototyping training and inference code, respectively. Automatic Model Tuning is used to automate the hyperparameter tuning process. Additionally, the AWS Step Functions Data Science SDK is used to automate the project workflow for production-ready environments outside notebooks. **PREREQUISITES:** From the *tf-2-workflow* directory, upload ONLY the Jupyter notebook `tf-2-workflow.ipynb`.
+- **TensorFlow Resources:**
 
-  - [**TensorFlow 2 Sentiment Analysis**](tf-sentiment-script-mode): SageMaker's prebuilt TensorFlow 2 container is used in this example to train a custom sentiment analysis model. Particular attention is paid to distributed hosted training in SageMaker with a multi-GPU instance, and to SageMaker Batch Transform for asynchronous, large-scale inference. **PREREQUISITES:** From the *tf-sentiment-script-mode* directory, upload ONLY the Jupyter notebook `sentiment-analysis.ipynb`.
+  - [**TensorFlow 2 Sentiment Analysis**](tf-sentiment-script-mode): SageMaker's prebuilt TensorFlow 2 container is used in this example to train a custom sentiment analysis model. Distributed hosted training in SageMaker is performed on a multi-GPU instance, and SageMaker Batch Transform is used for asynchronous, large-scale inference/batch scoring. **PREREQUISITES:** From the *tf-sentiment-script-mode* directory, upload ONLY the Jupyter notebook `sentiment-analysis.ipynb`.
 
-  - [**TensorFlow Distributed Training Options**](tf-distribution-options): This example demonstrates two different distributed training options in SageMaker's Script Mode: (1) parameter servers, and (2) Horovod. **PREREQUISITES:** From the *tf-distribution-options* directory, upload ONLY the Jupyter notebook `tf-distributed-training.ipynb`.
+  - [**TensorFlow 2 Workflow**](tf-2-workflow): This example shows a complete workflow for TensorFlow 2, automated with the AWS Step Functions Data Science SDK, an older alternative to [Amazon SageMaker Pipelines](https://aws.amazon.com/sagemaker/pipelines). To begin, SageMaker Processing is used to transform the dataset. Next, Local Mode training and Local Mode endpoints are demonstrated for prototyping training and inference code, respectively. Automatic Model Tuning is used to automate the hyperparameter tuning process. **PREREQUISITES:** From the *tf-2-workflow* directory, upload ONLY the Jupyter notebook `tf-2-workflow.ipynb`.
 
   - [**TensorFlow Highly Performant Batch Inference & Training**](tf-batch-inference-script): The focus of this example is highly performant batch inference using TensorFlow Serving, along with Horovod distributed training. To transform the input image data for inference, a preprocessing script is used with the Amazon SageMaker TensorFlow Serving container. **PREREQUISITES:** Be sure to upload all files in the *tf-batch-inference-script* directory (including the *code* subdirectory and its files) to the directory where you will run the related Jupyter notebook.
 
   - [**TensorFlow Text Classification with Word Embeddings**](tf-word-embeddings): In this example, TensorFlow's tf.keras API is used with Script Mode for a text classification task. An important aspect of the example is showing how to load preexisting word embeddings such as GloVe in Script Mode. Other features demonstrated include Local Mode endpoints as well as Local Mode training. **PREREQUISITES:** (1) Use a GPU-based (P3 or P2) SageMaker notebook instance, and (2) be sure to upload all files in the *tf-word-embeddings* directory (including the *code* subdirectory) to the directory where you will run the related Jupyter notebook.
 
   - [**TensorFlow with Horovod & Inference Pipeline**](tf-horovod-inference-pipeline): Script Mode with TensorFlow is used for a computer vision task, demonstrating Horovod distributed training and batch inference in conjunction with an Inference Pipeline that transforms image data before it is input to the model container. This is an alternative to the previous example, which uses a preprocessing script with the Amazon SageMaker TensorFlow Serving container rather than an Inference Pipeline. **PREREQUISITES:** Be sure to upload all files in the *tf-horovod-inference-pipeline* directory (including the *code* subdirectory and its files) to the directory where you will run the related Jupyter notebook.
 
-  - [**TensorFlow Eager Execution**](tf-eager-script-mode): NOTE: This example has been superseded by the **TensorFlow 2 Workflow** example above. This example shows how to use Script Mode with Eager Execution mode in TensorFlow 1.x, a more intuitive and dynamic alternative to the original graph mode of TensorFlow. It is the default mode of TensorFlow 2. Local Mode and Automatic Model Tuning also are demonstrated. **PREREQUISITES:** From the *tf-eager-script-mode* directory, upload ONLY the Jupyter notebook `tf-boston-housing.ipynb`.
 
-
-- **Bring Your Own (BYO) resources:**
+- **Bring Your Own (BYO) Resources:**
 
   - [**lightGBM BYO**](lightgbm-byo): In this repository, most samples use Amazon SageMaker prebuilt framework containers for TensorFlow and other frameworks. For this example, however, we'll show how to bring your own container to create a Script Mode-style experience similar to a prebuilt SageMaker framework container, using lightGBM, a popular gradient boosting framework. **PREREQUISITES:** From the *lightgbm-byo* directory, upload the Jupyter notebook `lightgbm-byo.ipynb`.
 
   - [**Deploy Pretrained Models**](deploy-pretrained-model): SageMaker's prebuilt PyTorch container is used to demonstrate how you can quickly take a pretrained or locally trained model and deploy it as a SageMaker hosted API endpoint. There are examples for both OpenAI's GPT-2 and BERT. **PREREQUISITES:** From the *deploy-pretrained-model* directory, upload the entire BERT or GPT2 folder's contents, depending on which model you select. Run either `Deploy_BERT.ipynb` or `Deploy_GPT2.ipynb`.
 
 
-- **Miscellaneous resources:**
+- **Miscellaneous Resources:**
 
   - [**K-means clustering**](k-means-clustering): Most of the samples in this repository involve supervised learning tasks in Amazon SageMaker Script Mode. For this example, by contrast, we'll undertake an unsupervised learning task, and do so with the Amazon SageMaker K-means built-in algorithm rather than Script Mode. **PREREQUISITES:** From the *k-means-clustering* directory, upload the Jupyter notebook `k-means-clustering.ipynb`.
 
   - [**R in SageMaker Processing**](r-in-sagemaker-processing): In this example, R is used to perform some operations on a dataset and generate a plot within SageMaker Processing. The job results, including the plot image, are retrieved and displayed, demonstrating how R can easily be used within a SageMaker workflow. **PREREQUISITES:** From the *r-in-sagemaker-processing* directory, upload the Jupyter notebook `r-in-sagemaker_processing.ipynb`.
 
+
+- **Older Resources:**
+
+  - [**TensorFlow Distributed Training Options**](tf-distribution-options): **NOTE:** Besides the options listed here for TensorFlow 1.x, there are additional options for TensorFlow 2, including (1) built-in [**SageMaker Distributed Training**](https://aws.amazon.com/sagemaker/distributed-training/) for both data and model parallelism, and (2) native distribution strategies such as MirroredStrategy, as demonstrated in the **TensorFlow 2 Sentiment Analysis** example above. This TensorFlow 1.x example demonstrates two other distributed training options for SageMaker's Script Mode: (1) parameter servers, and (2) Horovod. **PREREQUISITES:** From the *tf-distribution-options* directory, upload ONLY the Jupyter notebook `tf-distributed-training.ipynb`.
+
+  - [**TensorFlow Eager Execution**](tf-eager-script-mode): **NOTE:** This TensorFlow 1.x example has been superseded by the **TensorFlow 2 Workflow** example above. It shows how to use Script Mode with Eager Execution mode in TensorFlow 1.x, a more intuitive and dynamic alternative to the original graph mode of TensorFlow and the default mode of TensorFlow 2. Local Mode and Automatic Model Tuning are also demonstrated. **PREREQUISITES:** From the *tf-eager-script-mode* directory, upload ONLY the Jupyter notebook `tf-boston-housing.ipynb`.
+
+
 
 ## License
 
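For readers skimming this diff, the Script Mode experience the README keeps referring to amounts to handing an ordinary training script to one of SageMaker's prebuilt framework containers via the SageMaker Python SDK. A minimal sketch, assuming a hypothetical `train.py`; the IAM role ARN and S3 paths are placeholders, not files from this repository:

```python
from sagemaker.tensorflow import TensorFlow

# Script Mode: an ordinary TensorFlow script runs inside SageMaker's
# prebuilt container. Role ARN, script name, and S3 paths are hypothetical.
estimator = TensorFlow(
    entry_point="train.py",  # plain TensorFlow script, usable outside SageMaker too
    role="arn:aws:iam::111122223333:role/SageMakerRole",  # hypothetical IAM role
    instance_count=1,
    instance_type="ml.m5.xlarge",
    framework_version="2.1",
    py_version="py3",
)

# Each channel becomes an SM_CHANNEL_* environment variable in the container.
estimator.fit({"train": "s3://my-bucket/data/train"})  # hypothetical bucket
```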
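The parameter server and Horovod options contrasted in the **TensorFlow Distributed Training Options** entry are selected through the same estimator's `distribution` argument. A sketch of both settings under the same hypothetical role and placeholder scripts; instance counts and types are illustrative only:

```python
from sagemaker.tensorflow import TensorFlow

role = "arn:aws:iam::111122223333:role/SageMakerRole"  # hypothetical IAM role

# Option 1: asynchronous updates through parameter servers.
ps_estimator = TensorFlow(
    entry_point="train.py",  # hypothetical script
    role=role,
    instance_count=2,
    instance_type="ml.p3.2xlarge",
    framework_version="1.15",
    py_version="py3",
    distribution={"parameter_server": {"enabled": True}},
)

# Option 2: synchronous ring-allreduce with Horovod, launched via MPI.
hvd_estimator = TensorFlow(
    entry_point="train_hvd.py",  # hypothetical Horovod-enabled script
    role=role,
    instance_count=2,
    instance_type="ml.p3.2xlarge",
    framework_version="1.15",
    py_version="py3",
    distribution={"mpi": {"enabled": True, "processes_per_host": 1}},
)
```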
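Likewise, the Script Mode-style experience for SageMaker Processing (see the **R in SageMaker Processing** entry) comes down to running an ordinary script inside a container. A minimal sketch, assuming a hypothetical custom ECR image with `Rscript` on the PATH; the image URI, role, script name, and bucket are placeholders:

```python
from sagemaker.processing import ProcessingInput, ProcessingOutput, ScriptProcessor

# ECR image URI, role ARN, script, and bucket below are hypothetical.
processor = ScriptProcessor(
    image_uri="111122223333.dkr.ecr.us-east-1.amazonaws.com/r-processing:latest",
    command=["Rscript"],  # interpreter used to run the supplied script
    role="arn:aws:iam::111122223333:role/SageMakerRole",
    instance_count=1,
    instance_type="ml.m5.xlarge",
)

processor.run(
    code="preprocessing.R",  # ordinary R script; nothing SageMaker-specific required
    inputs=[ProcessingInput(source="s3://my-bucket/raw",
                            destination="/opt/ml/processing/input")],
    outputs=[ProcessingOutput(source="/opt/ml/processing/output")],
)
```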