All of the estimators can be used with SageMaker Automatic Model Tuning, which performs hyperparameter tuning jobs.
A hyperparameter tuning job runs multiple training jobs that differ by the values of their hyperparameters to find the best training job.
It then chooses the hyperparameter values that result in a model that performs the best, as measured by a metric that you choose.
If you're not using an Amazon ML algorithm, then the metric is defined by a regular expression (regex) that you provide, which is used to extract the metric's value from the training job's logs.
You can read more about SageMaker Automatic Model Tuning in the `AWS documentation <https://docs.aws.amazon.com/sagemaker/latest/dg/automatic-model-tuning.html>`__.

The SageMaker Python SDK contains a ``HyperparameterTuner`` class for creating and interacting with hyperparameter tuning jobs.
Here is a basic example of how to use it:

.. code:: python

    from sagemaker.tuner import HyperparameterTuner, ContinuousParameter
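    # NOTE: the remainder of this example is an illustrative sketch that matches
    # the description below (up to 100 training jobs, 10 at a time, learning rate
    # searched between 0.05 and 0.06). The estimator variable, metric name, regex,
    # and S3 paths are assumed placeholders rather than part of the original snippet.

    # Configure the tuner with the estimator to tune, the objective metric,
    # the hyperparameter ranges to search, and the job limits
    my_tuner = HyperparameterTuner(
        estimator=my_estimator,  # a previously configured SageMaker estimator
        objective_metric_name='validation-accuracy',
        hyperparameter_ranges={'learning_rate': ContinuousParameter(0.05, 0.06)},
        metric_definitions=[{'Name': 'validation-accuracy',
                             'Regex': 'validation-accuracy = ([0-9\\.]+)'}],
        max_jobs=100,          # total number of training jobs
        max_parallel_jobs=10)  # training jobs run concurrently

    # Start the hyperparameter tuning job
    my_tuner.fit({'train': 's3://my-bucket/train', 'test': 's3://my-bucket/test'})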

This example shows a hyperparameter tuning job that creates up to 100 training jobs, running up to 10 at a time.
Each training job's learning rate will be a value between 0.05 and 0.06, but this value will differ between training jobs.
You can read more about how these values are chosen in the `AWS documentation <https://docs.aws.amazon.com/sagemaker/latest/dg/automatic-model-tuning-how-it-works.html>`__.

A hyperparameter range can be one of three types: continuous, integer, or categorical.
The SageMaker Python SDK provides corresponding classes for defining these different types.
You can define up to 20 hyperparameters to search over, but each value of a categorical hyperparameter range counts against that limit.
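As a minimal illustration (the hyperparameter names here are hypothetical rather than taken from the example above), a set of ranges might be defined like this:

.. code:: python

    from sagemaker.tuner import ContinuousParameter, IntegerParameter, CategoricalParameter

    hyperparameter_ranges = {
        'learning_rate': ContinuousParameter(0.05, 0.06),    # continuous range
        'mini_batch_size': IntegerParameter(32, 512),        # integer range
        'optimizer': CategoricalParameter(['sgd', 'adam']),  # each value counts toward the 20-hyperparameter limit
    }
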
If you are using an Amazon ML algorithm, you don't need to pass in anything for ``metric_definitions``.
In addition, the ``fit()`` call uses a list of ``RecordSet`` objects instead of a dictionary:

.. code:: python

    # Create RecordSet object for each data channel
    train_records = RecordSet(...)
    test_records = RecordSet(...)

    # Start hyperparameter tuning job
    my_tuner.fit([train_records, test_records])
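
In practice, when you use one of the Amazon algorithm estimators, the ``RecordSet`` objects are often produced from in-memory numpy arrays with the estimator's ``record_set()`` helper. A minimal sketch, where the array names and channels are illustrative:

.. code:: python

    # Build RecordSet objects from numpy arrays; the data is uploaded to S3
    # and referenced by the returned RecordSet objects
    train_records = my_estimator.record_set(train_features, labels=train_labels, channel='train')
    test_records = my_estimator.record_set(test_features, labels=test_labels, channel='test')
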
There is also an analytics object associated with each ``HyperparameterTuner`` instance that presents useful information about the hyperparameter tuning job.
For example, the ``dataframe`` method gets a pandas dataframe summarizing the associated training jobs:

.. code:: python

    # Retrieve analytics object
    my_tuner_analytics = my_tuner.analytics()

    # Look at summary of associated training jobs
    my_dataframe = my_tuner_analytics.dataframe()
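
The dataframe contains one row per training job, including the final value of the objective metric. As an illustrative follow-up, and assuming the objective-metric column is named ``FinalObjectiveValue``, you could sort it to surface the best jobs:

.. code:: python

    # Show the best training jobs first (assumes the objective is maximized)
    my_dataframe.sort_values('FinalObjectiveValue', ascending=False).head()
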
For more detailed examples of running hyperparameter tuning jobs, see:

- `Using the TensorFlow estimator with hyperparameter tuning <https://github.com/awslabs/amazon-sagemaker-examples/blob/master/hyperparameter_tuning/tensorflow_mnist/hpo_tensorflow_mnist.ipynb>`__
- `Bringing your own estimator for hyperparameter tuning <https://github.com/awslabs/amazon-sagemaker-examples/blob/master/hyperparameter_tuning/r_bring_your_own/hpo_r_bring_your_own.ipynb>`__

``doc/index.rst``

With the SDK, you can train and deploy models using popular deep learning frameworks: **Apache MXNet** and **TensorFlow**. You can also train and deploy models with **algorithms provided by Amazon**; these are scalable implementations of core machine learning algorithms that are optimized for SageMaker and GPU training. If you have **your own algorithms** built into SageMaker-compatible Docker containers, you can train and host models using these as well.

Here you'll find the API docs for the SageMaker Python SDK. The project home page is on GitHub: https://github.com/aws/sagemaker-python-sdk, where you can find the SDK source, installation instructions, and a general overview of the library.

Overview
----------

The SageMaker Python SDK consists of a few primary interfaces: