Add support for hyperparameter tuning jobs #1

Merged: 61 commits merged into hyperparameter-tuning-support from hpo on May 31, 2018

Conversation

laurenyu
Owner

No description provided.

nadiaya and others added 30 commits March 27, 2018 22:12
Fix whitespace.
Make Python 3 the default for the PyTorch estimator.
Add PyTorch estimator and model
…sion based on the current environment python version.
Add integration tests for PyTorch prediction.
aws#11 updated master to reflect the public SDK. This change brings this branch up to date.
… method (aws#15)

* Refactor EstimatorBase and Framework to have a prepare_for_training() method

* Specify argument directly instead of using **kwargs
This follows the series of changes introducing a
prepare_for_training() function. It also includes a small change
allowing RecordSet to generate the input channel format expected by fit().
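
For background on the pattern those commits describe, here is a minimal, hypothetical sketch (not the SDK's actual implementation; every class internal below is an assumption): prepare_for_training() resolves pre-training setup in one place, and a RecordSet maps itself to the {channel_name: data_location} dict that fit() consumes.

```python
import time


class RecordSet:
    """Hypothetical stand-in for the SDK's RecordSet class."""

    def __init__(self, s3_data, channel="train"):
        self.s3_data = s3_data
        self.channel = channel

    def data_channel(self):
        # Produce the input channel format expected by fit():
        # a dict mapping channel name to data location.
        return {self.channel: self.s3_data}


class EstimatorBase:
    """Hypothetical base class illustrating prepare_for_training()."""

    def __init__(self, base_job_name=None):
        self.base_job_name = base_job_name
        self._current_job_name = None

    def prepare_for_training(self, job_name=None):
        # Centralize setup that must happen before a job starts
        # (here, just job naming), so that fit() and hyperparameter
        # tuning can share the same code path.
        self._current_job_name = job_name or "{}-{}".format(
            self.base_job_name or "training-job", int(time.time())
        )

    def fit(self, records, job_name=None):
        self.prepare_for_training(job_name=job_name)
        channels = records.data_channel()
        print("Starting", self._current_job_name, "with channels", channels)


estimator = EstimatorBase(base_job_name="demo")
estimator.fit(RecordSet("s3://my-bucket/train"))
```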
laurenyu and others added 28 commits May 23, 2018 16:45
Since this feature is publicly known as "hyperparameter tuning" (or some variation of that), it doesn't make sense to use "hpo" everywhere
* Initial checkin of SageMaker HPO Analytics library (see the usage sketch after this commit list).
Follow-up to aws#39, which fixed the circular dependency
We originally weren't honoring job names passed through fit().
This change fixes that.
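
Taken together, those last two changes can be exercised as below. This is a hedged usage sketch, not code from the PR: the job name, bucket path, and estimator are placeholders, and the import path for the analytics class is an assumption.

```python
# Usage sketch; "my-tuning-job" and the S3 path are placeholders,
# and the import path is an assumption.
from sagemaker.analytics import HyperparameterTuningJobAnalytics

# Job names passed through fit() are now honored instead of being
# replaced by a generated name:
# estimator.fit({"train": "s3://my-bucket/train"}, job_name="my-training-job")

# The analytics library summarizes a tuning job's training runs as a
# pandas DataFrame: one row per training job, with columns for its
# hyperparameters and final objective value.
analytics = HyperparameterTuningJobAnalytics("my-tuning-job")
df = analytics.dataframe()
print(df.head())
```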
@laurenyu laurenyu merged commit 42974a2 into hyperparameter-tuning-support May 31, 2018
@laurenyu laurenyu deleted the hpo branch May 31, 2018 18:23
7 participants