SageMaker Automatic Model Tuning
--------------------------------
All of the estimators can be used with SageMaker Automatic Model Tuning, which performs hyperparameter tuning jobs.
A hyperparameter tuning job runs multiple training jobs that differ by the values of their hyperparameters to find the best training job.
It then chooses the hyperparameter values that result in the best-performing model, as measured by a metric that you choose.
If you're not using an Amazon ML algorithm, then the metric is defined by a regular expression (regex) that you provide for searching through the training job's logs.
You can read more about SageMaker Automatic Model Tuning in the `AWS documentation <https://docs.aws.amazon.com/sagemaker/latest/dg/automatic-model-tuning.html>`__.
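
As an illustration of that regex mechanism (the metric name and log line below are made up for this sketch, not taken from this README), a pattern with one capture group pulls the numeric value out of a log line:

```python
import re

# Hypothetical metric definition in SageMaker's metric_definitions
# shape: a name plus a regex with one capture group for the value.
metric_definition = {
    "Name": "validation-accuracy",
    "Regex": r"validation-accuracy=(\d+\.\d+)",
}

# Hypothetical line from a training job's logs
log_line = "epoch 3: validation-accuracy=0.8931 loss=0.2117"

match = re.search(metric_definition["Regex"], log_line)
accuracy = float(match.group(1)) if match else None
print(accuracy)  # -> 0.8931
```

Each metric definition needs exactly one capture group; the captured text is parsed as the metric's numeric value.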
The SageMaker Python SDK contains a ``HyperparameterTuner`` class for creating and interacting with hyperparameter tuning jobs.
Here is a basic example of how to use it:

.. code:: python

    # ...

    # Tear down the SageMaker endpoint
    my_tuner.delete_endpoint()

This example shows a hyperparameter tuning job that creates up to 100 training jobs, running up to 10 training jobs at a time.
Each training job's learning rate will be a value between 0.05 and 0.06, but this value will differ between training jobs.
You can read more about how these values are chosen in the `AWS documentation <https://docs.aws.amazon.com/sagemaker/latest/dg/automatic-model-tuning-how-it-works.html>`__.
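
As a rough mental model of the job-count bookkeeping (a toy sequential sketch; the real service schedules jobs concurrently and picks hyperparameter values with its own search strategy, not this random sampler):

```python
import random

MAX_JOBS = 100          # total training jobs the tuning job may create
MAX_PARALLEL_JOBS = 10  # how many may run at a time

def train(learning_rate):
    """Stand-in for a training job; returns a mock objective metric."""
    return 1.0 - abs(learning_rate - 0.055)  # hypothetical: best near 0.055

results = []
jobs_launched = 0
while jobs_launched < MAX_JOBS:
    # Launch the next batch of up to MAX_PARALLEL_JOBS jobs
    batch = min(MAX_PARALLEL_JOBS, MAX_JOBS - jobs_launched)
    for _ in range(batch):
        lr = random.uniform(0.05, 0.06)  # the learning-rate range above
        results.append((train(lr), lr))
    jobs_launched += batch

# The tuning job keeps the best training job by objective metric
best_metric, best_lr = max(results)
print(jobs_launched)  # -> 100
```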

A hyperparameter range can be one of three types: continuous, integer, or categorical.
The SageMaker Python SDK provides corresponding classes for defining these different types.
You can define up to 20 hyperparameters to search over, but each value of a categorical hyperparameter range counts against that limit.
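
To make the three range types concrete (illustrative pure Python, not the SDK's own range classes; the hyperparameter names and bounds below are invented for this sketch):

```python
import random

# Illustrative stand-ins for the three range types the SDK models
# with dedicated classes: continuous, integer, and categorical.
ranges = {
    "learning_rate": ("continuous", 0.05, 0.06),
    "mini_batch_size": ("integer", 32, 512),
    "optimizer": ("categorical", ["sgd", "adam", "rmsprop"]),
}

def sample(ranges):
    """Draw one hyperparameter combination from the given ranges."""
    combo = {}
    for name, spec in ranges.items():
        kind = spec[0]
        if kind == "continuous":
            combo[name] = random.uniform(spec[1], spec[2])
        elif kind == "integer":
            combo[name] = random.randint(spec[1], spec[2])
        else:  # categorical: pick one of the listed values
            combo[name] = random.choice(spec[1])
    return combo

combo = sample(ranges)
print(combo)
```

Note that the categorical range above alone would count three values against the 20-hyperparameter limit.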

If you are using an Amazon ML algorithm, you don't need to pass in anything for ``metric_definitions``.
In addition, the ``fit()`` call uses a list of ``RecordSet`` objects instead of a dictionary:

.. code:: python

    # Create RecordSet object for each data channel
    train_records = RecordSet(...)
    test_records = RecordSet(...)

    # Start hyperparameter tuning job
    my_tuner.fit([train_records, test_records])

There is also an analytics object associated with each ``HyperparameterTuner`` instance that presents useful information about the hyperparameter tuning job.
For example, the ``dataframe`` method gets a pandas dataframe summarizing the associated training jobs:

.. code:: python

    # ...

    # Look at summary of associated training jobs
    my_dataframe = my_tuner_analytics.dataframe()
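
To give a feel for the kind of summary that dataframe holds (the column names and values below are hypothetical, not the real schema; a minimal sketch assuming pandas is installed):

```python
import pandas as pd

# Hypothetical rows resembling a tuning-job summary: one row per
# training job, with its hyperparameter values and final metric.
rows = [
    {"TrainingJobName": "job-1", "learning_rate": 0.052, "FinalObjectiveValue": 0.89},
    {"TrainingJobName": "job-2", "learning_rate": 0.058, "FinalObjectiveValue": 0.91},
    {"TrainingJobName": "job-3", "learning_rate": 0.055, "FinalObjectiveValue": 0.87},
]
df = pd.DataFrame(rows)

# Such a summary makes it easy to find the best training job
best = df.loc[df["FinalObjectiveValue"].idxmax()]
print(best["TrainingJobName"])  # -> job-2
```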

For more detailed examples of running hyperparameter tuning jobs, see:

- `Using the TensorFlow estimator with hyperparameter tuning <https://github.com/awslabs/amazon-sagemaker-examples/blob/master/hyperparameter_tuning/tensorflow_mnist/hpo_tensorflow_mnist.ipynb>`__
- `Bringing your own estimator for hyperparameter tuning <https://github.com/awslabs/amazon-sagemaker-examples/blob/master/hyperparameter_tuning/r_bring_your_own/hpo_r_bring_your_own.ipynb>`__