
Commit fc9dc39

Merge branch 'dev' into fix-local-mode-root-files
2 parents: 1a533e8 + 6319bce

26 files changed (+4082 −468 lines)

CHANGELOG.md (+9)

@@ -1,5 +1,14 @@
 # Changelog
 
+## v2.72.1 (2021-12-20)
+
+### Bug Fixes and Other Changes
+
+* typos and broken link
+* S3Input - add support for instance attributes
+* Prevent repack_model script from referencing nonexistent directories
+* Set ProcessingStep upload locations deterministically to avoid c…
+
 ## v2.72.0 (2021-12-13)
 
 ### Features

VERSION (+1 −1)

@@ -1 +1 @@
-2.72.1.dev0
+2.72.2.dev0

doc/api/training/smd_data_parallel.rst (+3 −3)

@@ -1,6 +1,6 @@
-##########################
-Distributed data parallel
-##########################
+###############################################
+The SageMaker Distributed Data Parallel Library
+###############################################
 
 SageMaker's distributed data parallel library extends SageMaker’s training
 capabilities on deep learning models with near-linear scaling efficiency,
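
The data parallel library documented on this renamed page is switched on from the SageMaker Python SDK rather than configured in the RST docs themselves. As a minimal sketch of that workflow (the entry-point script, IAM role, framework versions, and S3 path below are placeholder assumptions, not values from this commit), a training job opts in through the estimator's `distribution` argument:

    from sagemaker.pytorch import PyTorch

    # Minimal sketch: launch a training job with the SageMaker distributed
    # data parallel library enabled. Script name, role ARN, versions, and
    # S3 path are placeholder assumptions, not taken from this commit.
    estimator = PyTorch(
        entry_point="train.py",  # hypothetical training script
        role="arn:aws:iam::123456789012:role/SageMakerRole",  # placeholder role
        framework_version="1.8.1",
        py_version="py36",
        instance_count=2,
        instance_type="ml.p3.16xlarge",  # the library requires multi-GPU instance types
        distribution={"smdistributed": {"dataparallel": {"enabled": True}}},
    )
    estimator.fit("s3://my-bucket/training-data")  # placeholder S3 input

The training script itself then imports the smdistributed.dataparallel modules; the estimator flag only provisions and wires up the cluster.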

doc/api/training/smd_model_parallel.rst (+25 −39)

@@ -1,5 +1,5 @@
-Distributed model parallel
---------------------------
+The SageMaker Distributed Model Parallel Library
+------------------------------------------------
 
 The Amazon SageMaker distributed model parallel library is a model parallelism library for training
 large deep learning models that were previously difficult to train due to GPU memory limitations.
@@ -9,49 +9,35 @@ allowing you to increase prediction accuracy by creating larger models with more
 You can use the library to automatically partition your existing TensorFlow and PyTorch workloads
 across multiple GPUs with minimal code changes. The library's API can be accessed through the Amazon SageMaker SDK.
 
-Use the following sections to learn more about the model parallelism and the library.
-
-Use with the SageMaker Python SDK
-=================================
-
-Use the following page to learn how to configure and enable distributed model parallel
-when you configure an Amazon SageMaker Python SDK `Estimator`.
+See the following sections to learn more about the SageMaker model parallel library APIs.
 
 .. toctree::
-   :maxdepth: 1
+   :maxdepth: 3
 
+   smp_versions/latest
    smd_model_parallel_general
 
-API Documentation
-=================
-
-The library contains a Common API that is shared across frameworks, as well as APIs
-that are specific to supported frameworks, TensorFlow and PyTorch.
-
-Select a version to see the API documentation for version. To use the library, reference the
-**Common API** documentation alongside the framework specific API documentation.
-
-.. toctree::
-   :maxdepth: 1
-
-   smp_versions/latest.rst
-   smp_versions/v1_3_0.rst
-   smp_versions/v1_2_0.rst
-   smp_versions/v1_1_0.rst
-
-It is recommended to use this documentation alongside `SageMaker Distributed Model Parallel
-<http://docs.aws.amazon.com/sagemaker/latest/dg/model-parallel.html>`__ in the Amazon SageMaker
-developer guide. This developer guide documentation includes:
 
-- An overview of model parallelism and the library
-  `core features <https://docs.aws.amazon.com/sagemaker/latest/dg/model-parallel-core-features.html>`__
-- Instructions on how to modify `TensorFlow
-  <https://docs.aws.amazon.com/sagemaker/latest/dg/model-parallel-customize-training-script.html#model-parallel-customize-training-script-tf>`__
-  and `PyTorch
-  <https://docs.aws.amazon.com/sagemaker/latest/dg/model-parallel-customize-training-script.html#model-parallel-customize-training-script-pt>`__
-  training scripts
-- `Configuration tips and pitfalls
-  <https://docs.aws.amazon.com/sagemaker/latest/dg/model-parallel-customize-tips-pitfalls.html>`__
+.. tip::
+
+   We recommend using this API documentation with the conceptual guide at
+   `SageMaker's Distributed Model Parallel
+   <http://docs.aws.amazon.com/sagemaker/latest/dg/model-parallel.html>`_
+   in the *Amazon SageMaker developer guide*. This developer guide documentation includes:
+
+   - An overview of model parallelism, and the library's
+     `core features <https://docs.aws.amazon.com/sagemaker/latest/dg/model-parallel-core-features.html>`_,
+     and `extended features for PyTorch <https://docs.aws.amazon.com/sagemaker/latest/dg/model-parallel-extended-features-pytorch.html>`_.
+   - Instructions on how to modify `TensorFlow
+     <https://docs.aws.amazon.com/sagemaker/latest/dg/model-parallel-customize-training-script-tf.html>`_
+     and `PyTorch
+     <https://docs.aws.amazon.com/sagemaker/latest/dg/model-parallel-customize-training-script-pt.html>`_
+     training scripts.
+   - Instructions on how to `run a distributed training job using the SageMaker Python SDK
+     and the SageMaker model parallel library
+     <https://docs.aws.amazon.com/sagemaker/latest/dg/model-parallel-sm-sdk.html>`_.
+   - `Configuration tips and pitfalls
+     <https://docs.aws.amazon.com/sagemaker/latest/dg/model-parallel-customize-tips-pitfalls.html>`_.
 
 
 .. important::
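
Since the rewritten page now points readers to the developer guide for running model parallel jobs through the SageMaker Python SDK, a companion sketch of that estimator configuration may help. All values below (script, role, partition counts, MPI settings) are illustrative assumptions, not anything specified by this commit:

    from sagemaker.pytorch import PyTorch

    # Minimal sketch: enable the SageMaker distributed model parallel library
    # through the estimator's `distribution` argument. All values are
    # illustrative assumptions.
    smp_options = {
        "enabled": True,
        "parameters": {
            "partitions": 2,    # number of model partitions
            "microbatches": 4,  # split each batch for pipelined execution
            "ddp": True,        # combine model parallelism with data parallelism
        },
    }
    mpi_options = {
        "enabled": True,
        "processes_per_host": 8,  # e.g. one process per GPU on an 8-GPU instance
    }

    estimator = PyTorch(
        entry_point="train.py",  # hypothetical training script
        role="arn:aws:iam::123456789012:role/SageMakerRole",  # placeholder role
        framework_version="1.8.1",
        py_version="py36",
        instance_count=1,
        instance_type="ml.p3.16xlarge",
        distribution={
            "smdistributed": {"modelparallel": smp_options},
            "mpi": mpi_options,
        },
    )
    estimator.fit()

The library launches via MPI, which is why the `mpi` block accompanies the `modelparallel` options; the training script then uses the smdistributed.modelparallel APIs documented on the pages this commit reorganizes.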
