
Commit be73e59

Author: Talia Chopra
SM Distributed: adding new versions for DLC launch
1 parent efb2513

File tree: 6 files changed, +20 −22 lines

doc/api/training/sdp_versions/latest/smd_data_parallel_tensorflow.rst

1 addition, 1 deletion

```diff
@@ -157,7 +157,7 @@ TensorFlow API
 
 .. rubric:: Supported versions
 
-**TensorFlow 2.4.1**
+**TensorFlow 2.3.1, 2.4.1**
 
 .. function:: smdistributed.dataparallel.tensorflow.init()
 
```

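For context, `smdistributed.dataparallel.tensorflow.init()` in the hunk above is the library's entry point. A minimal sketch of how a TensorFlow 2.x training script typically calls it, assuming a SageMaker Deep Learning Container with `smdistributed` installed (model and data pipeline omitted):

```python
# Minimal sketch: initializing SageMaker's data parallel library for TensorFlow.
# Assumes a SageMaker Deep Learning Container with smdistributed available.
import tensorflow as tf
import smdistributed.dataparallel.tensorflow as sdp

sdp.init()  # set up the process group across all GPUs/workers

# Pin each worker process to a single GPU, keyed by its local rank.
gpus = tf.config.experimental.list_physical_devices("GPU")
if gpus:
    tf.config.experimental.set_visible_devices(gpus[sdp.local_rank()], "GPU")

print(f"worker {sdp.rank()} of {sdp.size()} initialized")
```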
doc/api/training/sdp_versions/v1.1.x/smd_data_parallel_pytorch.rst

7 additions, 7 deletions

```diff
@@ -4,11 +4,11 @@ PyTorch Guide to SageMaker's distributed data parallel library
 
 .. admonition:: Contents
 
-- :ref:`pytorch-sdp-modify`
-- :ref:`pytorch-sdp-api`
+- :ref:`pytorch-sdp-modify-11x`
+- :ref:`pytorch-sdp-api-11x`
+
+.. _pytorch-sdp-modify-11x:
 
-.. _pytorch-sdp-modify:
-:noindex:
 
 Modify a PyTorch training script to use SageMaker data parallel
 ======================================================================
@@ -149,15 +149,15 @@ you will have for distributed training with the distributed data parallel library
     main()
 
 
-.. _pytorch-sdp-api-1.1.x:
-:noindex:
+.. _pytorch-sdp-api-11x:
+
 
 PyTorch API
 ===========
 
 .. rubric:: Supported versions
 
-**PyTorch 1.7.1, 1.8.0**
+**PyTorch 1.7.1, 1.8.1**
 
 
 .. function:: smdistributed.dataparallel.torch.distributed.is_available()
```

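As a reader aid (not part of the commit): the `is_available()` call in the last context line gates initialization in a typical script. A hedged sketch of the v1.1.x PyTorch setup this page documents, assuming the SageMaker DLC environment:

```python
# Minimal sketch: initializing SageMaker's data parallel library for PyTorch.
# Assumes a SageMaker Deep Learning Container with smdistributed available.
import torch
import smdistributed.dataparallel.torch.distributed as dist
from smdistributed.dataparallel.torch.parallel.distributed import (
    DistributedDataParallel as DDP,
)

if dist.is_available():       # the function documented in the hunk above
    dist.init_process_group()

torch.cuda.set_device(dist.get_local_rank())  # one GPU per process
model = torch.nn.Linear(10, 1).cuda()
model = DDP(model)            # gradients are AllReduced across workers
```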
doc/api/training/sdp_versions/v1.1.x/smd_data_parallel_tensorflow.rst

8 additions, 10 deletions

```diff
@@ -4,11 +4,11 @@ TensorFlow Guide to SageMaker's distributed data parallel library
 
 .. admonition:: Contents
 
-- :ref:`tensorflow-sdp-modify`
-- :ref:`tensorflow-sdp-api`
+- :ref:`tensorflow-sdp-modify-11x`
+- :ref:`tensorflow-sdp-api-11x`
+
+.. _tensorflow-sdp-modify-11x:
 
-.. _tensorflow-sdp-modify:
-:noindex:
 
 Modify a TensorFlow 2.x training script to use SageMaker data parallel
 ======================================================================
@@ -151,18 +151,16 @@ script you will have for distributed training with the library.
     checkpoint.save(checkpoint_dir)
 
 
-.. _tensorflow-sdp-api:
-:noindex:
+.. _tensorflow-sdp-api-11x:
+
 
 TensorFlow API
 ==============
 
 .. rubric:: Supported versions
 
-TensorFlow is supported in version 1.0.0 of ``sagemakerdistributed.dataparallel``.
-Reference version 1.0.0 `TensorFlow API documentation
-<https://sagemaker.readthedocs.io/en/stable/api/training/sdp_versions/latest/smd_data_parallel_tensorflow.html#tensorflow-sdp-api>`_
-for supported TensorFlow versions.
+Use version 1.0.0 or version 1.2.0 or later of ``smdistributed.dataparallel`` to use this
+library with TensorFlow.
 
 .. function:: smdistributed.dataparallel.tensorflow.init()
    :noindex:
```

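The hunk above ends at `checkpoint.save(checkpoint_dir)`, the tail of the page's example script. A hedged sketch of the training-step pattern that script builds up to (names like `model`, `loss_fn`, and `opt` are stand-ins, not from the commit):

```python
# Sketch of the "modify a TF2 training script" pattern this page documents.
# Assumes sdp.init() and GPU pinning have already run, as shown earlier.
import tensorflow as tf
import smdistributed.dataparallel.tensorflow as sdp

@tf.function
def training_step(model, loss_fn, opt, images, labels, first_batch):
    with tf.GradientTape() as tape:
        loss = loss_fn(labels, model(images, training=True))
    # Wrap the tape so gradients are AllReduced across workers.
    tape = sdp.DistributedGradientTape(tape)
    grads = tape.gradient(loss, model.trainable_variables)
    opt.apply_gradients(zip(grads, model.trainable_variables))
    if first_batch:
        # Broadcast initial weights and optimizer state from rank 0
        # so every worker starts from identical parameters.
        sdp.broadcast_variables(model.variables, root_rank=0)
        sdp.broadcast_variables(opt.variables(), root_rank=0)
    return loss

# Only the leader writes checkpoints, to avoid rank collisions:
#     if sdp.rank() == 0:
#         checkpoint.save(checkpoint_dir)
```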
doc/api/training/smp_versions/latest.rst

1 addition, 1 deletion

```diff
@@ -1,5 +1,5 @@
 
-Version 1.3.0 (Latest)
+Version 1.3.x (Latest)
 ======================
 
 To use the library, reference the Common API documentation alongside the framework specific API documentation.
```

doc/api/training/smp_versions/latest/smd_model_parallel_pytorch.rst

2 additions, 2 deletions

```diff
@@ -6,7 +6,7 @@
 PyTorch API
 ===========
 
-**Supported versions: 1.6.0, 1.7.1, 1.8.0**
+**Supported versions: 1.7.1, 1.8.1**
 
 This API document assumes you use the following import statements in your training scripts.
 
@@ -268,7 +268,7 @@ This API document assumes you use the following import statements in your training scripts.
 
 .. function:: register_comm_hook( state, callable )
 
-**Available for PyTorch 1.8.0 only**
+**Available for PyTorch 1.8.1 only**
 Registers a communication hook which is an enhancement that provides
 a flexible hook ``callable`` to users where they can specify how
 gradients are aggregated across multiple workers. This method will be called on the wrapped ``DistributedDataParallel`` instance.
```

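The `register_comm_hook` description mirrors the communication-hook interface that stock `torch.nn.parallel.DistributedDataParallel` gained in PyTorch 1.8. A hedged sketch of such a hook, modeled on torch's built-in fp16 compression hook; whether the smdistributed-wrapped model accepts exactly this signature is an assumption based on the torch API it mirrors:

```python
# Hedged sketch: a gradient-compression communication hook in the PyTorch 1.8
# comm-hook style (hook(state, bucket) -> torch.futures.Future).
import torch
import torch.distributed as dist

def fp16_compress_hook(state, bucket):
    # Compress the bucket's gradients to fp16 before the AllReduce.
    world_size = dist.get_world_size()
    compressed = bucket.get_tensor().to(torch.float16).div_(world_size)
    fut = dist.all_reduce(compressed, async_op=True).get_future()

    def decompress(fut):
        # Copy the reduced fp16 result back into the fp32 gradient bucket.
        decompressed = bucket.get_tensor()
        decompressed.copy_(fut.value()[0])
        return [decompressed]

    return fut.then(decompress)

# model = DistributedDataParallel(model)  # the wrapped instance named above
# model.register_comm_hook(None, fp16_compress_hook)
```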
doc/api/training/smp_versions/latest/smd_model_parallel_tensorflow.rst

1 addition, 1 deletion

```diff
@@ -1,7 +1,7 @@
 TensorFlow API
 ==============
 
-**Supported version: 2.4.1, 2.3.1**
+**Supported version: 2.3.1, 2.4.1**
 
 **Important**: This API document assumes you use the following import statement in your training scripts.
 
```

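The import statement referenced in the last context line is truncated out of the diff. For the model parallel library's TensorFlow API it is conventionally the alias below; treat the exact lines as an assumption rather than a quote from this page:

```python
# Hedged sketch: entry point for the model parallel library with TF2.
# The import alias follows the library's documented convention; the page's
# actual import statement is truncated out of the diff above.
import tensorflow as tf
import smdistributed.modelparallel.tensorflow as smp

smp.init()  # initialize the model parallel runtime

# With TF2, the training model subclasses smp.DistributedModel
# (a drop-in for tf.keras.Model) so the library can partition it.
class Model(smp.DistributedModel):
    def __init__(self):
        super().__init__()
        self.dense = tf.keras.layers.Dense(10)

    def call(self, x):
        return self.dense(x)
```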