Add docs on TensorFlowModel class usage + requirements file for serving #393
When the ``deploy`` call finishes, the created SageMaker Endpoint is ready for predictions. The following sections explain
how to make predictions against the Endpoint, how to use different content-types in your requests, and how to extend the Web server
functionality.
Deploying directly from model artifacts
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
If you already have existing model artifacts, you can skip training and deploy them directly to an endpoint:

.. code:: python

    from sagemaker.tensorflow import TensorFlowModel

    tf_model = TensorFlowModel(model_data='s3://mybucket/model.tar.gz',
                               role='MySageMakerRole',
                               entry_point='entry.py',
                               name='model_name')

    predictor = tf_model.deploy(initial_instance_count=1, instance_type='ml.c4.xlarge')

You can also optionally specify a pip `requirements file <https://pip.pypa.io/en/stable/reference/pip_install/#requirements-file-format>`__ if you need to install additional packages into the deployed
runtime environment, by including it in your ``source_dir`` and specifying it in the ``'SAGEMAKER_REQUIREMENTS'`` env variable:

.. code:: python

    from sagemaker.tensorflow import TensorFlowModel

    tf_model = TensorFlowModel(model_data='s3://mybucket/model.tar.gz',
                               role='MySageMakerRole',
                               entry_point='entry.py',
                               source_dir='my_src',  # directory which contains entry_point script and requirements file
                               name='model_name',
                               env={'SAGEMAKER_REQUIREMENTS': 'requirements.txt'})  # path relative to source_dir

    predictor = tf_model.deploy(initial_instance_count=1, instance_type='ml.c4.xlarge')

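The requirements file uses the standard pip format, one requirement specifier per line. As a minimal sketch (the directory layout and package names below are illustrative only, matching the hypothetical ``my_src`` used above):

.. code:: text

    my_src/
    ├── entry.py          # entry_point script
    └── requirements.txt  # SAGEMAKER_REQUIREMENTS, resolved relative to source_dir

    # requirements.txt (illustrative contents)
    numpy>=1.14
    pillow==5.1.0
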
Making predictions against a SageMaker Endpoint
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~