Commit 0e70caa

minor fix
1 parent ed33ed1 commit 0e70caa

File tree

1 file changed: +9 −9 lines changed


doc/api/training/smd_data_parallel_release_notes/smd_data_parallel_change_log.rst

@@ -1,6 +1,6 @@
 .. _sdp_1.2.2_release_note:

-Sagemaker Distributed Data Parallel 1.2.2 Release Notes
+SageMaker Distributed Data Parallel 1.2.2 Release Notes
 =======================================================

 *Date: November. 24. 2021*
@@ -35,7 +35,7 @@ This version passed benchmark testing and is migrated to the following AWS Deep
 Release History
 ===============

-Sagemaker Distributed Data Parallel 1.2.1 Release Notes
+SageMaker Distributed Data Parallel 1.2.1 Release Notes
 ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

 *Date: June. 29. 2021*
@@ -66,7 +66,7 @@ This version passed benchmark testing and is migrated to the following AWS Deep
 763104351884.dkr.ecr.<region>.amazonaws.com/tensorflow-training:2.5.0-gpu-py37-cu112-ubuntu18.04-v1.0


-Sagemaker Distributed Data Parallel 1.2.0 Release Notes
+SageMaker Distributed Data Parallel 1.2.0 Release Notes
 ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

 - New features
@@ -79,15 +79,15 @@ Sagemaker Distributed Data Parallel 1.2.0 Release Notes
   AllReduce. For best performance, it is recommended you use an
   instance type that supports Amazon Elastic Fabric Adapter
   (ml.p3dn.24xlarge and ml.p4d.24xlarge) when you train a model using
-  Sagemaker Distributed data parallel.
+  SageMaker Distributed data parallel.

 **Bug Fixes:**

 - Improved performance on single node and small clusters.

 ----

-Sagemaker Distributed Data Parallel 1.1.2 Release Notes
+SageMaker Distributed Data Parallel 1.1.2 Release Notes
 ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

 - Bug Fixes
@@ -101,15 +101,15 @@ Sagemaker Distributed Data Parallel 1.1.2 Release Notes

 **Known Issues:**

-- Sagemaker Distributed data parallel has slower throughput than NCCL
+- SageMaker Distributed data parallel has slower throughput than NCCL
   when run using a single node. For the best performance, use
   multi-node distributed training with smdistributed.dataparallel. Use
   a single node only for experimental runs while preparing your
   training pipeline.

 ----

-Sagemaker Distributed Data Parallel 1.1.1 Release Notes
+SageMaker Distributed Data Parallel 1.1.1 Release Notes
 ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

 - New Features
@@ -136,7 +136,7 @@ Sagemaker Distributed Data Parallel 1.1.1 Release Notes

 ----

-Sagemaker Distributed Data Parallel 1.1.0 Release Notes
+SageMaker Distributed Data Parallel 1.1.0 Release Notes
 ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

 - New Features
@@ -172,7 +172,7 @@ SDK Guide

 ----

-Sagemaker Distributed Data Parallel 1.0.0 Release Notes
+SageMaker Distributed Data Parallel 1.0.0 Release Notes
 ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

 - First Release
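
The release notes touched by this diff recommend EFA-capable instance types (ml.p3dn.24xlarge, ml.p4d.24xlarge) and multi-node training when using smdistributed.dataparallel. As a minimal sketch of how that recommendation is typically applied through the SageMaker Python SDK's `distribution` argument: the entry-point script, IAM role ARN, and framework versions below are illustrative assumptions, not part of this commit.

```python
# Minimal sketch (not part of this commit): launching a multi-node training
# job with smdistributed.dataparallel enabled via the SageMaker Python SDK.
# train.py, the role ARN, and the framework versions are placeholders.
from sagemaker.pytorch import PyTorch

estimator = PyTorch(
    entry_point="train.py",              # hypothetical training script
    role="arn:aws:iam::111122223333:role/SageMakerRole",  # placeholder IAM role
    framework_version="1.8.1",           # illustrative; match a supported DLC
    py_version="py36",
    instance_count=2,                    # multi-node, per the known-issues note above
    instance_type="ml.p4d.24xlarge",     # EFA-capable type recommended in the notes
    distribution={"smdistributed": {"dataparallel": {"enabled": True}}},
)
estimator.fit()
```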
