Commit 35534b5 (1 parent: ed8308d)

structural changes, fix links

4 files changed, +54 −49 lines changed

4 files changed

+54
-49
lines changed

doc/api/training/smd_model_parallel.rst (+22 −44)

@@ -11,55 +11,33 @@ across multiple GPUs with minimal code changes. The library's API can be accesse
 
 See the following sections to learn more about the SageMaker model parallel library APIs.
 
-Use with the SageMaker Python SDK
-=================================
-
-Walk through the following pages to learn about the library's APIs
-to configure and enable distributed model parallelism
-through an Amazon SageMaker estimator.
-
 .. toctree::
-   :maxdepth: 1
+   :maxdepth: 3
 
+   smp_versions/latest
   smd_model_parallel_general
 
-Use the Library's API to Adapt Training Scripts
-===============================================
-
-The library provides Common APIs that you can use across frameworks,
-as well as framework-specific APIs for TensorFlow and PyTorch.
-
-Select the latest or one of the previous versions of the API documentation
-depending on which version of the library you need to use.
-To use the library, reference the
-**Common API** documentation alongside the framework specific API documentation.
-
-.. toctree::
-   :maxdepth: 2
-
-   smp_versions/latest.rst
-
-To find archived API documentation for the previous versions of the library,
-see the following link:
-
-.. toctree::
-   :maxdepth: 1
-
-   smp_versions/archives.rst
-
-It is recommended to use this documentation alongside `SageMaker Distributed Model Parallel
-<http://docs.aws.amazon.com/sagemaker/latest/dg/model-parallel.html>`__ in the Amazon SageMaker
-developer guide. This developer guide documentation includes:
 
-- An overview of model parallelism and the library
-  `core features <https://docs.aws.amazon.com/sagemaker/latest/dg/model-parallel-core-features.html>`__
-- Instructions on how to modify `TensorFlow
-  <https://docs.aws.amazon.com/sagemaker/latest/dg/model-parallel-customize-training-script.html#model-parallel-customize-training-script-tf>`__
-  and `PyTorch
-  <https://docs.aws.amazon.com/sagemaker/latest/dg/model-parallel-customize-training-script.html#model-parallel-customize-training-script-pt>`__
-  training scripts
-- `Configuration tips and pitfalls
-  <https://docs.aws.amazon.com/sagemaker/latest/dg/model-parallel-customize-tips-pitfalls.html>`__
+.. tip::
+
+   We recommend using this API documentation with the conceptual guide at
+   `SageMaker's Distributed Model Parallel
+   <http://docs.aws.amazon.com/sagemaker/latest/dg/model-parallel.html>`_
+   in the *Amazon SageMaker developer guide*. This developer guide documentation includes:
+
+   - An overview of model parallelism, and the library's
+     `core features <https://docs.aws.amazon.com/sagemaker/latest/dg/model-parallel-core-features.html>`_,
+     and `extended features for PyTorch <https://docs.aws.amazon.com/sagemaker/latest/dg/model-parallel-extended-features-pytorch.html>`_.
+   - Instructions on how to modify `TensorFlow
+     <https://docs.aws.amazon.com/sagemaker/latest/dg/model-parallel-customize-training-script-tf.html>`_
+     and `PyTorch
+     <https://docs.aws.amazon.com/sagemaker/latest/dg/model-parallel-customize-training-script-pt.html>`_
+     training scripts.
+   - Instructions on how to `run a distributed training job using the SageMaker Python SDK
+     and the SageMaker model parallel library
+     <https://docs.aws.amazon.com/sagemaker/latest/dg/model-parallel-sm-sdk.html>`_.
+   - `Configuration tips and pitfalls
+     <https://docs.aws.amazon.com/sagemaker/latest/dg/model-parallel-customize-tips-pitfalls.html>`_.
 
 
 .. important::

doc/api/training/smd_model_parallel_general.rst (+11 −3)

@@ -1,7 +1,15 @@
+#################################
+Use with the SageMaker Python SDK
+#################################
+
+Walk through the following pages to learn about the library's APIs
+to configure and enable distributed model parallelism
+through an Amazon SageMaker estimator.
+
 .. _sm-sdk-modelparallel-params:
 
-Configuration Parameters for Model Parallelism
-==============================================
+Configuration Parameters for ``distribution``
+=============================================
 
 Amazon SageMaker's TensorFlow and PyTorch estimator objects contain a ``distribution`` parameter,
 which is used to enable and specify parameters for the
@@ -59,7 +67,7 @@ in the `SageMaker's Distributed Model Parallel developer guide <https://docs.aws
    :depth: 3
    :local:
 
-Parameters for ``"smdistributed"``
+Parameters for ``smdistributed``
 ----------------------------------
 
 You can use the following parameters to initialize the library
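The ``distribution`` parameter discussed in this file is a plain Python dictionary passed to the estimator constructor. As a minimal sketch of how such a configuration could look — the nested ``smdistributed``/``modelparallel``/``mpi`` structure follows the library's documented pattern, but the specific parameter values below are illustrative assumptions, not recommendations:

```python
# Sketch of a `distribution` configuration for a SageMaker framework
# estimator. The key structure ("smdistributed" -> "modelparallel",
# plus "mpi") follows the SageMaker model parallel library docs; the
# concrete values are illustrative assumptions only.
distribution = {
    "smdistributed": {
        "modelparallel": {
            "enabled": True,
            "parameters": {
                "partitions": 2,    # number of model partitions (assumed)
                "microbatches": 4,  # microbatches for pipelining (assumed)
            },
        }
    },
    "mpi": {
        "enabled": True,
        "processes_per_host": 8,  # e.g. one process per GPU (assumed)
    },
}

# This dict would then be passed to a TensorFlow or PyTorch estimator,
# roughly: estimator = PyTorch(..., distribution=distribution)
```

In practice the dictionary goes to the ``distribution`` argument of the TensorFlow or PyTorch estimator object, as described above; the full set of accepted keys is listed in the parameter tables of this file.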

doc/api/training/smp_versions/archives.rst (+2 −2)

@@ -1,7 +1,7 @@
 .. _smdmp-pt-version-archive:
 
-Version Archive
-===============
+Documentation Archive
+=====================
 
 .. toctree::
    :maxdepth: 1

doc/api/training/smp_versions/latest.rst (+19 −0)

@@ -1,3 +1,14 @@
+###############################################
+Use the Library's API to Adapt Training Scripts
+###############################################
+
+The library provides Common APIs that you can use across frameworks,
+as well as framework-specific APIs for TensorFlow and PyTorch.
+
+Select the latest or one of the previous versions of the API documentation
+depending on which version of the library you need to use.
+To use the library, reference the
+**Common API** documentation alongside the framework-specific API documentation.
 
 Version 1.6.0 (Latest)
 ======================
@@ -11,3 +22,11 @@ To use the library, reference the Common API documentation alongside the framewo
    latest/smd_model_parallel_pytorch
    latest/smd_model_parallel_pytorch_tensor_parallel
    latest/smd_model_parallel_tensorflow
+
+To find archived API documentation for the previous versions of the library,
+see the following link:
+
+.. toctree::
+   :maxdepth: 1
+
+   archives
