Commit d981ec3

remove dist word embedding option

1 parent: 060f681

1 file changed (+2 −5 lines)


doc/api/training/smp_versions/latest/smd_model_parallel_pytorch.rst (+2 −5)
@@ -498,7 +498,7 @@ smdistributed.modelparallel.torch.DistributedOptimizer
 smdistributed.modelparallel.torch Context Managers and Util Functions
 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
 
-.. function:: smdistributed.modelparallel.torch.model_creation(tensor_parallelism=False, dtype=None, distribute_embedding=False, **tensor_parallel_config)
+.. function:: smdistributed.modelparallel.torch.model_creation(tensor_parallelism=False, dtype=None, **tensor_parallel_config)
 
    Context manager to create a ``torch`` model. This API combines both the
    :class:`smdistributed.modelparallel.torch.tensor_parallelism` and
@@ -522,8 +522,6 @@ smdistributed.modelparallel.torch Context Managers and Util Functions
    in the *Amazon SageMaker Developer Guide*.
 
    :type dtype: ``torch.dtype``
-   :param distribute_embedding: Whether to enable vocabulary parallelism for NLP models.
-   :type distribute_embedding: boolean
    :param tensor_parallel_config: kwargs to specify other tensor parallel configs.
      This is not used if ``tensor_parallelism`` is ``False``.
    :type tensor_parallel_config: dict
@@ -536,8 +534,7 @@ smdistributed.modelparallel.torch Context Managers and Util Functions
 
       with smp.model_creation(
           tensor_parallelism=smp.tp_size() > 1,
-          dtype=torch.float16 if args.fp16 else torch.get_default_dtype(),
-          distribute_embedding=False
+          dtype=torch.float16 if args.fp16 else torch.get_default_dtype()
       ):
           model = MyModel(...)
 
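After this commit, callers pass only ``tensor_parallelism``, ``dtype``, and any remaining tensor-parallel kwargs; a ``distribute_embedding`` argument would now be rejected. A minimal sketch of the updated call shape, where ``model_creation`` below is a hypothetical stand-in mimicking the documented signature, not the real ``smdistributed`` implementation:

```python
from contextlib import contextmanager

# Hypothetical stand-in for smdistributed.modelparallel.torch.model_creation.
# It only mirrors the post-commit signature: no distribute_embedding parameter.
@contextmanager
def model_creation(tensor_parallelism=False, dtype=None, **tensor_parallel_config):
    config = {
        "tensor_parallelism": tensor_parallelism,
        "dtype": dtype,
        **tensor_parallel_config,
    }
    # The real API would set up the model-creation context here.
    yield config

# Updated call shape: tensor_parallelism + dtype + extra tensor-parallel kwargs.
with model_creation(tensor_parallelism=True, dtype=None) as cfg:
    assert "distribute_embedding" not in cfg
```

Passing ``distribute_embedding=...`` to this signature would simply land in ``tensor_parallel_config`` (or raise in a stricter real implementation), which is why the parameter and its docstring entries were dropped together.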