Commit 5c604bf

Basil Beirouti authored and committed
documentation: edit to clarify how to use inference.py (aws#3194)
Co-authored-by: Basil Beirouti <[email protected]>
1 parent a359ad2 commit 5c604bf

File tree

1 file changed: +3 −1 lines changed

doc/frameworks/tensorflow/using_tf.rst (+3 −1)
@@ -759,7 +759,7 @@ Create Python Scripts for Custom Input and Output Formats
 ---------------------------------------------------------

 You can add your customized Python code to process your input and output data.
-This customized Python code must be named ``inference.py`` and specified through the ``entry_point`` parameter:
+This customized Python code must be named ``inference.py`` and is specified through the ``entry_point`` parameter:

 .. code::

@@ -769,6 +769,8 @@ This customized Python code must be named ``inference.py`` and specified through
     model_data='s3://mybucket/model.tar.gz',
     role='MySageMakerRole')
+
+In the example above, ``inference.py`` is assumed to be a file inside ``model.tar.gz``. If you want to use a local file instead, you must add the ``source_dir`` argument. See the documentation on `TensorFlowModel <https://sagemaker.readthedocs.io/en/stable/frameworks/tensorflow/sagemaker.tensorflow.html#sagemaker.tensorflow.model.TensorFlowModel>`_.

 How to implement the pre- and/or post-processing handler(s)
 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
