Add docs on TensorFlowModel class usage + requirements file for serving #393

Merged 2 commits on Sep 19, 2018

32 changes: 32 additions & 0 deletions src/sagemaker/tensorflow/README.rst
@@ -647,6 +647,38 @@ When the ``deploy`` call finishes, the created SageMaker Endpoint is ready for p
how to make predictions against the Endpoint, how to use different content-types in your requests, and how to extend the Web server
functionality.

Deploying directly from model artifacts
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

If you already have model artifacts in S3, you can skip training and deploy them directly to an endpoint:

.. code:: python

  from sagemaker.tensorflow import TensorFlowModel

  tf_model = TensorFlowModel(model_data='s3://mybucket/model.tar.gz',
                             role='MySageMakerRole',
                             entry_point='entry.py',
                             name='model_name')

  predictor = tf_model.deploy(initial_instance_count=1, instance_type='ml.c4.xlarge')
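
The returned ``predictor`` can be used to invoke the endpoint right away. A minimal sketch, assuming the model accepts a flat list of numeric features (the exact request format depends on your model and ``entry_point`` script):

.. code:: python

  # Hypothetical input; adapt the payload to whatever your model
  # and entry point script expect.
  result = predictor.predict([6.4, 3.2, 4.5, 1.5])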

You can also optionally provide a pip `requirements file <https://pip.pypa.io/en/stable/reference/pip_install/#requirements-file-format>`_ if you need to install additional packages into the deployed
runtime environment. Include the file in your ``source_dir`` and point the ``SAGEMAKER_REQUIREMENTS`` environment variable at it (as a path relative to ``source_dir``):

.. code:: python

  from sagemaker.tensorflow import TensorFlowModel

  tf_model = TensorFlowModel(model_data='s3://mybucket/model.tar.gz',
                             role='MySageMakerRole',
                             entry_point='entry.py',
                             source_dir='my_src',  # directory which contains entry_point script and requirements file
                             name='model_name',
                             env={'SAGEMAKER_REQUIREMENTS': 'requirements.txt'})  # path relative to source_dir

  predictor = tf_model.deploy(initial_instance_count=1, instance_type='ml.c4.xlarge')
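
For instance, a ``source_dir`` matching the example above might be laid out as follows (the file names and the pinned package are only illustrative)::

  my_src/
  ├── entry.py           # script passed as entry_point
  └── requirements.txt   # e.g. a single line such as: pillow==5.2.0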

Making predictions against a SageMaker Endpoint
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~