Add docs on TensorFlowModel class usage + requirements file for serving #393

Merged · 2 commits · Sep 19, 2018
33 changes: 33 additions & 0 deletions src/sagemaker/tensorflow/README.rst
When the ``deploy`` call finishes, the created SageMaker Endpoint is ready for predictions. The next section explains
how to make predictions against the Endpoint, how to use different content-types in your requests, and how to extend the Web server
functionality.

Deploying directly from model artifacts
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

If you already have existing model artifacts, you can skip training and deploy them directly to an endpoint:

.. code:: python

    from sagemaker.tensorflow import TensorFlowModel

    tf_model = TensorFlowModel(model_data='s3://mybucket/model.tar.gz',
                               role='MySageMakerRole',
                               entry_point='entry.py',
                               name='model_name')

    predictor = tf_model.deploy(initial_instance_count=1, instance_type='ml.c4.xlarge')

You can also optionally specify a pip `requirements file <https://pip.pypa.io/en/stable/reference/pip_install/#requirements-file-format>`__ if you need to install additional packages into the deployed
runtime environment. Include the file in your ``source_dir`` and specify its path in the ``SAGEMAKER_REQUIREMENTS`` environment variable:

.. code:: python

    from sagemaker.tensorflow import TensorFlowModel

    tf_model = TensorFlowModel(model_data='s3://mybucket/model.tar.gz',
                               role='MySageMakerRole',
                               entry_point='entry.py',
                               source_dir='my_src',  # directory which contains entry_point script and requirements file
                               name='model_name',
                               env={'SAGEMAKER_REQUIREMENTS': 'requirements.txt'})  # path relative to source_dir
    predictor = tf_model.deploy(initial_instance_count=1, instance_type='ml.c4.xlarge')
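
As a point of reference, the ``source_dir`` used above might be laid out as follows. The directory name and the packages listed are illustrative only; list whatever your ``entry_point`` script actually imports:

.. code::

    my_src/
    ├── entry.py           # the entry_point script
    └── requirements.txt   # referenced by SAGEMAKER_REQUIREMENTS

The requirements file uses the standard pip requirements format, one requirement per line, for example:

.. code::

    # illustrative contents of my_src/requirements.txt
    numpy>=1.14
    pillow==5.2.0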


Making predictions against a SageMaker Endpoint
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~