
Commit 3008a29

Add more detail to README for automatic model tuning (#225)
1 parent 3fb5516 commit 3008a29

1 file changed: +29 -3 lines changed

README.rst

@@ -270,10 +270,12 @@ SageMaker Automatic Model Tuning

All of the estimators can be used with SageMaker Automatic Model Tuning, which performs hyperparameter tuning jobs.
A hyperparameter tuning job runs multiple training jobs that differ by the values of their hyperparameters to find the best training job.
-The SageMaker Python SDK contains a ``HyperparameterTuner`` class for creating and interacting with hyperparameter training jobs.
+It then chooses the hyperparameter values that result in a model that performs the best, as measured by a metric that you choose.
+If you're not using an Amazon ML algorithm, then the metric is defined by a regular expression (regex) you provide for going through the training job's logs.
You can read more about SageMaker Automatic Model Tuning in the `AWS documentation <https://docs.aws.amazon.com/sagemaker/latest/dg/automatic-model-tuning.html>`__.
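
As a rough illustration of the regex-based metric described in the added lines above, a metric definition for a non-Amazon algorithm might look something like the following sketch (the metric name, regex, and log line are invented for illustration and are not part of this commit):

.. code:: python

    # Hypothetical metric definition: SageMaker scans the training job's logs
    # with the regex and records the captured group as the objective metric.
    # If the training script prints "validation-accuracy=0.9312", the tuner
    # records 0.9312 for that training job.
    objective_metric_name = 'validation-accuracy'
    metric_definitions = [
        {'Name': 'validation-accuracy', 'Regex': 'validation-accuracy=([0-9\\.]+)'}
    ]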

-Here is a basic example of how to use ``HyperparameterTuner`` to start tuning jobs instead of using an estimator to start training jobs:
+The SageMaker Python SDK contains a ``HyperparameterTuner`` class for creating and interacting with hyperparameter training jobs.
+Here is a basic example of how to use it:

.. code:: python

@@ -299,6 +301,26 @@ Here is a basic example of how to use ``HyperparameterTuner`` to start tuning jobs instead of using an estimator to start training jobs:
    # Tear down the SageMaker endpoint
    my_tuner.delete_endpoint()

+This example shows a hyperparameter tuning job that creates up to 100 training jobs, running up to 10 at a time.
+Each training job's learning rate will be a value between 0.05 and 0.06, but this value will differ between training jobs.
+You can read more about how these values are chosen in the `AWS documentation <https://docs.aws.amazon.com/sagemaker/latest/dg/automatic-model-tuning-how-it-works.html>`__.
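
The basic ``HyperparameterTuner`` example itself is unchanged by this commit, so the diff only shows its last two lines. As a minimal sketch of the kind of configuration the new paragraph describes (the estimator, metric regex, and S3 paths are placeholders, not taken from the README), it might look like:

.. code:: python

    from sagemaker.tuner import HyperparameterTuner, ContinuousParameter

    # Search learning_rate over a continuous range, running at most 100
    # training jobs with at most 10 in parallel, matching the numbers in
    # the explanation above.
    my_tuner = HyperparameterTuner(
        estimator=my_estimator,  # a previously configured estimator (placeholder)
        objective_metric_name='validation-accuracy',
        metric_definitions=[{'Name': 'validation-accuracy',
                             'Regex': 'validation-accuracy=([0-9\\.]+)'}],
        hyperparameter_ranges={'learning_rate': ContinuousParameter(0.05, 0.06)},
        max_jobs=100,
        max_parallel_jobs=10)

    # Start the hyperparameter tuning job (placeholder S3 channels)
    my_tuner.fit({'train': 's3://my_bucket/my_training_data',
                  'test': 's3://my_bucket/my_test_data'})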
+
+A hyperparameter range can be one of three types: continuous, integer, or categorical.
+The SageMaker Python SDK provides corresponding classes for defining these different types.
+You can define up to 20 hyperparameters to search over, but each value of a categorical hyperparameter range counts against that limit.
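
The corresponding range classes live in ``sagemaker.tuner``. A short sketch, with hyperparameter names invented purely for illustration:

.. code:: python

    from sagemaker.tuner import (ContinuousParameter, IntegerParameter,
                                 CategoricalParameter)

    # One example of each range type; the hyperparameter names depend on
    # the algorithm being tuned and are placeholders here.
    hyperparameter_ranges = {
        'learning_rate': ContinuousParameter(0.05, 0.06),    # continuous
        'mini_batch_size': IntegerParameter(64, 512),        # integer
        'optimizer': CategoricalParameter(['sgd', 'adam']),   # categorical
    }

Under the limit described above, this sketch uses four of the twenty allowed hyperparameters: one each for ``learning_rate`` and ``mini_batch_size``, and one per value of ``optimizer``.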
+
+If you are using an Amazon ML algorithm, you don't need to pass in anything for ``metric_definitions``.
+In addition, the ``fit()`` call uses a list of ``RecordSet`` objects instead of a dictionary:
+
+.. code:: python
+
+    # Create RecordSet object for each data channel
+    train_records = RecordSet(...)
+    test_records = RecordSet(...)
+
+    # Start hyperparameter tuning job
+    my_tuner.fit([train_records, test_records])
+
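
The ``RecordSet`` construction is elided in the README; one way to obtain these objects, assuming an Amazon algorithm estimator's ``record_set()`` helper (an assumption, not part of this commit), would be:

.. code:: python

    import numpy as np

    # record_set() uploads the arrays to S3 and returns a RecordSet that
    # points at the uploaded data; the feature and label arrays here are
    # placeholders.
    train_records = my_estimator.record_set(np.asarray(train_features, dtype='float32'),
                                            labels=np.asarray(train_labels, dtype='float32'),
                                            channel='train')
    test_records = my_estimator.record_set(np.asarray(test_features, dtype='float32'),
                                           labels=np.asarray(test_labels, dtype='float32'),
                                           channel='test')

    # Start hyperparameter tuning job
    my_tuner.fit([train_records, test_records])
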
There is also an analytics object associated with each ``HyperparameterTuner`` instance that presents useful information about the hyperparameter tuning job.
For example, the ``dataframe`` method gets a pandas dataframe summarizing the associated training jobs:

@@ -310,7 +332,11 @@ For example, the ``dataframe`` method gets a pandas dataframe summarizing the associated training jobs:
    # Look at summary of associated training jobs
    my_dataframe = my_tuner_analytics.dataframe()

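The context above starts in the middle of the analytics example; as a sketch of how the analytics object might be obtained and used (the ``analytics()`` accessor and the ``FinalObjectiveValue`` column name are assumptions, not shown in this diff):

.. code:: python

    # Retrieve the analytics object for the tuning job (assumed accessor)
    my_tuner_analytics = my_tuner.analytics()

    # Summarize the associated training jobs as a pandas dataframe and sort
    # them by the final value of the objective metric (assumed column name)
    my_dataframe = my_tuner_analytics.dataframe()
    best_first = my_dataframe.sort_values('FinalObjectiveValue', ascending=False)
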
-For more detailed examples of running hyperparameter tuning jobs, see: https://github.com/awslabs/amazon-sagemaker-examples.
+For more detailed examples of running hyperparameter tuning jobs, see:
+
+- `Using the TensorFlow estimator with hyperparameter tuning <https://github.com/awslabs/amazon-sagemaker-examples/blob/master/hyperparameter_tuning/tensorflow_mnist/hpo_tensorflow_mnist.ipynb>`__
+- `Bringing your own estimator for hyperparameter tuning <https://github.com/awslabs/amazon-sagemaker-examples/blob/master/hyperparameter_tuning/r_bring_your_own/hpo_r_bring_your_own.ipynb>`__
+- `Analyzing results <https://github.com/awslabs/amazon-sagemaker-examples/blob/master/hyperparameter_tuning/analyze_results/HPO_Analyze_TuningJob_Results.ipynb>`__

For more detailed explanations of the classes that this library provides for automatic model tuning, see:
