Add hyperparameter tuning support #207
Conversation
One small thing - but happy to have this fixed post-release.
based on the training image name and current timestamp.
    **kwargs: Other arguments
"""
if isinstance(inputs, list) or isinstance(inputs, RecordSet):
This is much better:
kwargs = dict(kwargs)
kwargs['job_name'] = job_name
self._prepare_for_training(**kwargs)
Basically, replace lines 140 to 145 with that block.
_prepare_for_training() still needs records for 1P estimators but not for the others, though. Instead, it'd end up looking like:
kwargs = dict(kwargs)
kwargs['job_name'] = job_name
if isinstance(inputs, list) or isinstance(inputs, RecordSet):
    kwargs['records'] = inputs
self.estimator._prepare_for_training(**kwargs)
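For readability, here is a minimal sketch of how that block could sit inside the tuner's fit() method, assuming self.estimator holds the wrapped estimator and RecordSet comes from sagemaker.amazon.amazon_estimator; the method signature and surrounding lines are illustrative, not the PR's actual diff:

from sagemaker.amazon.amazon_estimator import RecordSet


def fit(self, inputs, job_name=None, **kwargs):
    """Start a hyperparameter tuning job for the given inputs (sketch)."""
    # Copy so the caller's kwargs dict is not mutated, then forward the job name.
    kwargs = dict(kwargs)
    kwargs['job_name'] = job_name
    # Only the 1P (first-party) estimators take records; pass inputs through for them.
    if isinstance(inputs, list) or isinstance(inputs, RecordSet):
        kwargs['records'] = inputs
    self.estimator._prepare_for_training(**kwargs)
    # ...the tuning job itself would be created after this preparation step.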
Codecov Report
@@            Coverage Diff             @@
##           master     #207      +/-   ##
==========================================
+ Coverage   90.76%   91.49%   +0.73%
==========================================
  Files          42       45       +3
  Lines        2717     3162     +445
==========================================
+ Hits         2466     2893     +427
- Misses        251      269      +18

Continue to review full report at Codecov.
a7576b4 to 2c16ae8 (Compare)
Description of changes:
Add support for hyperparameter tuning jobs.

This introduces a few key features:
- HyperparameterTuner, which looks/acts like an estimator with fit(), deploy(), and attach(), except that it creates hyperparameter tuning jobs instead of regular training jobs (a usage sketch follows below)
- _prepare_for_training(), which should set all values needed before training

This PR also bumps the SDK version to 1.4.0.
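As a rough illustration of the intended interface, here is a minimal usage sketch. The estimator setup, parameter-range helpers, argument names, image, bucket, and metric regex are assumptions for illustration rather than details taken from this PR:

from sagemaker.estimator import Estimator
from sagemaker.tuner import HyperparameterTuner, ContinuousParameter, IntegerParameter

# Plain estimator for a custom training image (all names and values are placeholders).
estimator = Estimator(
    image_name='123456789012.dkr.ecr.us-west-2.amazonaws.com/my-training-image:latest',
    role='SageMakerRole',
    train_instance_count=1,
    train_instance_type='ml.m4.xlarge',
)

# The tuner wraps the estimator and searches over the given hyperparameter ranges.
tuner = HyperparameterTuner(
    estimator=estimator,
    objective_metric_name='validation:accuracy',
    metric_definitions=[{'Name': 'validation:accuracy',
                         'Regex': 'validation-accuracy=([0-9\\.]+)'}],
    hyperparameter_ranges={
        'learning_rate': ContinuousParameter(0.01, 0.2),
        'mini_batch_size': IntegerParameter(32, 256),
    },
    max_jobs=9,
    max_parallel_jobs=3,
)

# fit() starts a hyperparameter tuning job instead of a regular training job.
tuner.fit({'train': 's3://my-bucket/train', 'validation': 's3://my-bucket/validation'})

# deploy() hosts the best model found; attach() reconnects to an existing tuning job.
predictor = tuner.deploy(initial_instance_count=1, instance_type='ml.m4.xlarge')
existing = HyperparameterTuner.attach('my-tuning-job-name')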
Merge Checklist

Put an x in the boxes that apply. You can also fill these out after creating the PR. If you're unsure about any of them, don't hesitate to ask. We're here to help! This is simply a reminder of what we are going to look for before merging your pull request.

By submitting this pull request, I confirm that my contribution is made under the terms of the Apache 2.0 license.