From 22332eaa403544b16e3ef3e72b7b0386f0113bba Mon Sep 17 00:00:00 2001
From: Basil Beirouti
Date: Fri, 24 Jun 2022 16:56:35 -0700
Subject: [PATCH] doc edit for tensorflow serving containers

---
 doc/frameworks/tensorflow/using_tf.rst | 4 +++-
 1 file changed, 3 insertions(+), 1 deletion(-)

diff --git a/doc/frameworks/tensorflow/using_tf.rst b/doc/frameworks/tensorflow/using_tf.rst
index bd6cd36dcf..1e51b5f43a 100644
--- a/doc/frameworks/tensorflow/using_tf.rst
+++ b/doc/frameworks/tensorflow/using_tf.rst
@@ -759,7 +759,7 @@ Create Python Scripts for Custom Input and Output Formats
 ---------------------------------------------------------
 
 You can add your customized Python code to process your input and output data.
-This customized Python code must be named ``inference.py`` and specified through the ``entry_point`` parameter:
+This customized Python code must be named ``inference.py`` and is specified through the ``entry_point`` parameter:
 
 .. code::
 
@@ -769,6 +769,8 @@ This customized Python code must be named ``inference.py`` and specified through
                             model_data='s3://mybucket/model.tar.gz',
                             role='MySageMakerRole')
 
+In the example above, ``inference.py`` is assumed to be a file inside ``model.tar.gz``. If you want to use a local file instead, you must add the ``source_dir`` argument. See the documentation on `TensorFlowModel `_.
+
 How to implement the pre- and/or post-processing handler(s)
 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
 
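For reference, the ``source_dir`` usage described in the added paragraph looks roughly like the sketch below. It assumes a local directory named ``code/`` containing ``inference.py``; the directory name is illustrative, while ``entry_point``, ``source_dir``, ``model_data``, and ``role`` are the existing ``TensorFlowModel`` parameters:

.. code::

    from sagemaker.tensorflow import TensorFlowModel

    # entry_point names the handler script; source_dir points at the local
    # directory that contains it, so inference.py does not have to be packed
    # into model.tar.gz.
    model = TensorFlowModel(entry_point='inference.py',
                            source_dir='code',
                            model_data='s3://mybucket/model.tar.gz',
                            role='MySageMakerRole')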