Commit 47439f7

TEChopra1000 authored and aaronmarkham committed
doc: minor updates to doc strings (#520)
Co-authored-by: Aaron Markham <[email protected]>
1 parent e23e44e commit 47439f7

File tree

3 files changed: +9 −9 lines changed


src/sagemaker/fw_utils.py

Lines changed: 3 additions & 3 deletions
@@ -77,14 +77,14 @@ def validate_source_dir(script, directory):
 
 
 def get_mp_parameters(distribution):
-    """Get the model parallelism parameters provided by the user
+    """Get the model parallelism parameters provided by the user.
 
     Args:
-        distribution: distribution dictionary defined by the user
+        distribution: distribution dictionary defined by the user.
 
     Returns:
         params: dictionary containing model parallelism parameters
-            to be used for training
+            used for training.
     """
     try:
         mp_dict = distribution["smdistributed"]["modelparallel"]
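For context, `get_mp_parameters` reads the model-parallelism block out of the `distribution` dictionary that the docstring describes. A minimal usage sketch follows; the `partitions` and `microbatches` keys are illustrative assumptions, not a complete parameter list:

    from sagemaker.fw_utils import get_mp_parameters

    # Distribution dictionary as a user would define it; the "parameters"
    # keys below are illustrative examples only.
    distribution = {
        "smdistributed": {
            "modelparallel": {
                "enabled": True,
                "parameters": {"partitions": 2, "microbatches": 4},
            }
        }
    }

    # Per the updated docstring, this returns the model parallelism
    # parameters used for training, e.g. {"partitions": 2, "microbatches": 4}.
    params = get_mp_parameters(distribution)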

src/sagemaker/pytorch/estimator.py

Lines changed: 3 additions & 3 deletions
@@ -103,9 +103,9 @@ def __init__(
                 ``image_uri`` is required. If also ``None``, then a ``ValueError``
                 will be raised.
             distribution (dict): A dictionary with information on how to run distributed training
-                (default: None). Currently we support distributed training with parameter servers,
-                Model Parallelism, Data Parallelism, and MPI. Model Parallelism can only be used
-                with MPI.
+                (default: None). Currently, the following are supported:
+                distributed training with parameter servers, SageMaker Distributed (SMD) Data
+                and Model Parallelism, and MPI. SMD Model Parallelism can only be used with MPI.
                 To enable parameter server use the following setup:
 
                 .. code:: python
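The hunk ends just before the docstring's embedded example. For reference, enabling parameter servers on the PyTorch estimator looks roughly like the sketch below; the role ARN, script name, versions, and instance settings are placeholder assumptions:

    from sagemaker.pytorch import PyTorch

    estimator = PyTorch(
        entry_point="train.py",  # hypothetical training script
        role="arn:aws:iam::123456789012:role/SageMakerRole",  # placeholder role ARN
        framework_version="1.6.0",
        py_version="py3",
        instance_count=2,
        instance_type="ml.p3.2xlarge",
        # Parameter-server based distributed training, per the docstring.
        distribution={"parameter_server": {"enabled": True}},
    )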

src/sagemaker/tensorflow/estimator.py

Lines changed: 3 additions & 3 deletions
@@ -81,9 +81,9 @@ def __init__(
                 ``image_uri`` is required. If also ``None``, then a ``ValueError``
                 will be raised.
             distribution (dict): A dictionary with information on how to run distributed training
-                (default: None). Currently we support distributed training with parameter servers,
-                Model Parallelism, Data Parallelism, and MPI. Model Parallelism can only be used
-                with MPI.
+                (default: None). Currently, the following are supported:
+                distributed training with parameter servers, SageMaker Distributed (SMD) Data
+                and Model Parallelism, and MPI. SMD Model Parallelism can only be used with MPI.
                 To enable parameter server use the following setup:
 
                 .. code:: python
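Since the updated docstring notes that SMD Model Parallelism can only be used with MPI, here is a sketch of that combination on the TensorFlow estimator; all concrete values (role ARN, script, versions, instances, and the `partitions` parameter) are placeholder assumptions:

    from sagemaker.tensorflow import TensorFlow

    estimator = TensorFlow(
        entry_point="train.py",  # hypothetical training script
        role="arn:aws:iam::123456789012:role/SageMakerRole",  # placeholder role ARN
        framework_version="2.3.1",
        py_version="py37",
        instance_count=2,
        instance_type="ml.p3.16xlarge",
        distribution={
            # SMD Model Parallelism requires MPI, per the docstring.
            "smdistributed": {
                "modelparallel": {
                    "enabled": True,
                    "parameters": {"partitions": 2},  # illustrative parameter
                }
            },
            "mpi": {"enabled": True, "processes_per_host": 2},
        },
    )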
