diff --git a/README.rst b/README.rst
index f52a76d45e..72cb766ad3 100644
--- a/README.rst
+++ b/README.rst
@@ -270,10 +270,12 @@ SageMaker Automatic Model Tuning
 All of the estimators can be used with SageMaker Automatic Model Tuning, which performs hyperparameter tuning jobs.
 A hyperparameter tuning job runs multiple training jobs that differ by the values of their hyperparameters to find the best training job.
-The SageMaker Python SDK contains a ``HyperparameterTuner`` class for creating and interacting with hyperparameter training jobs.
+It then chooses the hyperparameter values that result in a model that performs the best, as measured by a metric that you choose.
+If you're not using an Amazon ML algorithm, then the metric is defined by a regular expression (regex) that you provide, which is applied to the training job's logs.
 You can read more about SageMaker Automatic Model Tuning in the `AWS documentation `__.
 
-Here is a basic example of how to use ``HyperparameterTuner`` to start tuning jobs instead of using an estimator to start training jobs:
+The SageMaker Python SDK contains a ``HyperparameterTuner`` class for creating and interacting with hyperparameter tuning jobs.
+Here is a basic example of how to use it:
 
 .. code:: python
 
@@ -299,6 +301,26 @@ Here is a basic example of how to use ``HyperparameterTuner`` to start tuning jo
     # Tear down the SageMaker endpoint
     my_tuner.delete_endpoint()
 
+This example shows a hyperparameter tuning job that creates up to 100 training jobs, running up to 10 at a time.
+Each training job's learning rate will be a value between 0.05 and 0.06, but this value will differ between training jobs.
+You can read more about how these values are chosen in the `AWS documentation `__.
+
+A hyperparameter range can be one of three types: continuous, integer, or categorical.
+The SageMaker Python SDK provides corresponding classes for defining these different types.
+You can define up to 20 hyperparameters to search over, but each value of a categorical hyperparameter range counts against that limit.
+
+If you are using an Amazon ML algorithm, you don't need to pass in anything for ``metric_definitions``.
+In addition, the ``fit()`` call uses a list of ``RecordSet`` objects instead of a dictionary:
+
+.. code:: python
+
+    # Create RecordSet object for each data channel
+    train_records = RecordSet(...)
+    test_records = RecordSet(...)
+
+    # Start hyperparameter tuning job
+    my_tuner.fit([train_records, test_records])
+
 There is also an analytics object associated with each ``HyperparameterTuner`` instance that presents useful information about the hyperparameter tuning job.
 For example, the ``dataframe`` method gets a pandas dataframe summarizing the associated training jobs:
 
@@ -310,7 +332,11 @@ For example, the ``dataframe`` method gets a pandas dataframe summarizing the as
 .. code:: python
 
     # Retrieve analytics object
     my_tuner_analytics = my_tuner.analytics()
 
     # Look at summary of associated training jobs
     my_dataframe = my_tuner_analytics.dataframe()
 
-For more detailed examples of running hyperparameter tuning jobs, see: https://github.com/awslabs/amazon-sagemaker-examples.
+For more detailed examples of running hyperparameter tuning jobs, see:
+
+- `Using the TensorFlow estimator with hyperparameter tuning `__
+- `Bringing your own estimator for hyperparameter tuning `__
+- `Analyzing results `__
 
 For more detailed explanations of the classes that this library provides for automatic model tuning, see:
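
The range classes and the log-scraping metric described in the diff above can be combined as follows. This is a minimal sketch, assuming an estimator ``my_estimator`` has already been constructed; the metric name, the regex, and the S3 channel paths are illustrative placeholders, not values taken from this diff.

.. code:: python

    from sagemaker.tuner import (
        CategoricalParameter,
        ContinuousParameter,
        HyperparameterTuner,
        IntegerParameter,
    )

    # One class per range type; every value listed in a categorical range
    # counts against the 20-hyperparameter limit.
    hyperparameter_ranges = {
        'learning_rate': ContinuousParameter(0.05, 0.06),
        'epochs': IntegerParameter(10, 50),
        'optimizer': CategoricalParameter(['sgd', 'adam']),
    }

    # For a non-Amazon algorithm, the objective metric is scraped from the
    # training logs with a regex you provide (placeholder name and pattern).
    metric_definitions = [{'Name': 'validation:accuracy',
                           'Regex': 'accuracy = ([0-9\\.]+)'}]

    my_tuner = HyperparameterTuner(estimator=my_estimator,
                                   objective_metric_name='validation:accuracy',
                                   hyperparameter_ranges=hyperparameter_ranges,
                                   metric_definitions=metric_definitions,
                                   max_jobs=100,          # up to 100 training jobs in total
                                   max_parallel_jobs=10)  # up to 10 running at a time

    # Start the hyperparameter tuning job, with one S3 prefix per data channel
    my_tuner.fit({'train': 's3://my-bucket/train', 'test': 's3://my-bucket/test'})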