
Commit 8bbacae

Merge pull request aws#180 from matthieudelaro/master

Fix missing word

2 parents: 7d48a99 + e08a7b7

File tree: 1 file changed (+1, -1 lines)


sagemaker-python-sdk/tensorflow_distributed_mnist/tensorflow_distributed_mnist.ipynb

Lines changed: 1 addition & 1 deletion
@@ -120,7 +120,7 @@
 "When distributed training happens, the same neural network will be sent to the multiple training instances. Each instance will predict a batch of the dataset, calculate loss and minimize the optimizer. One entire loop of this process is called **training step**.\n",
 "\n",
 "### Syncronizing training steps\n",
-"A [global step](https://www.tensorflow.org/api_docs/python/tf/train/global_step) is a global variable shared between the instances. It necessary for distributed training, so the optimizer will keep track of the number of **training steps** between runs: \n",
+"A [global step](https://www.tensorflow.org/api_docs/python/tf/train/global_step) is a global variable shared between the instances. It's necessary for distributed training, so the optimizer will keep track of the number of **training steps** between runs: \n",
 "\n",
 "```python\n",
 "train_op = optimizer.minimize(loss, tf.train.get_or_create_global_step())\n",
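The notebook text this diff touches describes a global step shared between training instances. As a rough illustration of the idea (this sketch is not from the commit or the notebook; `GlobalStep` and `run_training_steps` are hypothetical names standing in for TensorFlow's `tf.train.get_or_create_global_step`), a shared counter incremented once per training step lets the optimizer see the total number of steps across all workers rather than a per-worker count:

```python
# Hypothetical sketch: a minimal stand-in for a shared global step,
# illustrating why distributed workers increment one common counter.

class GlobalStep:
    """Toy analogue of tf.train.get_or_create_global_step()."""
    def __init__(self):
        self.value = 0

    def increment(self):
        # Each completed training step, on any worker, bumps the
        # same shared counter by one.
        self.value += 1
        return self.value

def run_training_steps(global_step, steps_per_worker, num_workers):
    # All workers share the same counter, so the final value is the
    # total number of training steps across the whole cluster.
    for _ in range(num_workers):
        for _ in range(steps_per_worker):
            global_step.increment()
    return global_step.value

step = GlobalStep()
total = run_training_steps(step, steps_per_worker=100, num_workers=2)
print(total)  # 200: both workers contributed to the same counter
```

In real TensorFlow code, `optimizer.minimize(loss, tf.train.get_or_create_global_step())` wires this bookkeeping in automatically: the minimize op increments the graph's global step tensor each time it runs.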
