Commit 4ceff09

minor doc fix for smddp
1 parent 8cfabd0 commit 4ceff09

File tree

1 file changed: 3 additions, 14 deletions

doc/api/training/smd_data_parallel_release_notes/smd_data_parallel_change_log.rst

@@ -43,8 +43,9 @@ SageMaker Distributed Data Parallel 1.4.0 Release Notes
 
 **Improvements**
 
-* Support AllReduce Large Tensors
-* we support the following new arguments in the `PyTorch DDP class <https://pytorch.org/docs/stable/generated/torch.nn.parallel.DistributedDataParallel.html#torch.nn.parallel.DistributedDataParallel>`_.
+* Support for AllReduce Large Tensors
+* Support for the following new arguments in the `PyTorch DDP class
+  <https://pytorch.org/docs/stable/generated/torch.nn.parallel.DistributedDataParallel.html#torch.nn.parallel.DistributedDataParallel>`_.
 
   * ``broadcast_buffers``
   * ``find_unused_parameters``
@@ -70,18 +71,6 @@ This version passed benchmark testing and is migrated to the following AWS Deep
 
 
      763104351884.dkr.ecr.<region>.amazonaws.com/pytorch-training:1.10.2-gpu-py38-cu113-ubuntu20.04-sagemaker
 
-- PyTorch 1.10.0 DLC
-
-  .. code::
-
-     763104351884.dkr.ecr.<region>.amazonaws.com/pytorch-training:1.10.0-gpu-py38-cu113-ubuntu20.04-sagemaker
-
-- PyTorch 1.9.1 DLC
-
-  .. code::
-
-     763104351884.dkr.ecr.<region>.amazonaws.com/pytorch-training:1.9.1-gpu-py38-cu111-ubuntu20.04
-
 
 ----
