
Unexpected keyword argument 'strategy_config' at Session._map_tuning_config() #3452


Closed

yshen92 opened this issue Nov 3, 2022 · 8 comments

@yshen92

yshen92 commented Nov 3, 2022

Describe the bug
Fitting a HyperparameterTuner configured with strategy='Hyperband' and a strategy_config raises a TypeError when fit() is called.

To reproduce

import sagemaker
from sagemaker.tuner import (
    ContinuousParameter,
    HyperparameterTuner,
    HyperbandStrategyConfig,
)

# `estimator`, `traindataupload`, and `valdataupload` are defined earlier in
# the notebook (a SageMaker Estimator and two S3 URIs, respectively).

hyperparameter_ranges = {
    'learning_rate': ContinuousParameter(0.0001, 0.1),
}

objective_metric_name = 'validation:accuracy'
objective_type = 'Maximize'

metric_definitions = [{'Name': 'validation:accuracy',
                       'Regex': r'val_accuracy: (\S+)'}]

hyperband_config = HyperbandStrategyConfig(max_resource=100, min_resource=10)

tuner = HyperparameterTuner(estimator,
                            objective_metric_name,
                            hyperparameter_ranges,
                            metric_definitions,
                            max_jobs=1,
                            max_parallel_jobs=1,
                            strategy='Hyperband',
                            early_stopping_type='Auto',
                            objective_type=objective_type,
                            strategy_config=hyperband_config)

train_data = sagemaker.inputs.TrainingInput(
    traindataupload,
    distribution="FullyReplicated",
    content_type="text/csv",
    s3_data_type="S3Prefix",
)
validation_data = sagemaker.inputs.TrainingInput(
    valdataupload,
    distribution="FullyReplicated",
    content_type="text/csv",
    s3_data_type="S3Prefix",
)
data_channels = {"train": train_data, "validation": validation_data}

tuner.fit(inputs=data_channels, logs=True)

Expected behavior
The Hyperband hyperparameter tuning job launches successfully.

Screenshots or logs

INFO:sagemaker.image_uris:Defaulting to the only supported framework/algorithm version: latest.
INFO:sagemaker.image_uris:Ignoring unnecessary instance type: None.
WARNING:sagemaker.estimator:No finished training job found associated with this estimator. Please make sure this estimator is only used for building workflow config
WARNING:sagemaker.estimator:No finished training job found associated with this estimator. Please make sure this estimator is only used for building workflow config
---------------------------------------------------------------------------
TypeError                                 Traceback (most recent call last)
File <timed eval>:1

File /opt/conda/lib/python3.10/site-packages/sagemaker/workflow/pipeline_context.py:272, in runnable_by_pipeline.<locals>.wrapper(*args, **kwargs)
    268         return context
    270     return _StepArguments(retrieve_caller_name(self_instance), run_func, *args, **kwargs)
--> 272 return run_func(*args, **kwargs)

File /opt/conda/lib/python3.10/site-packages/sagemaker/tuner.py:709, in HyperparameterTuner.fit(self, inputs, job_name, include_cls_metadata, estimator_kwargs, wait, **kwargs)
    650 """Start a hyperparameter tuning job.
    651 
    652 Args:
   (...)
    706         arguments are needed.
    707 """
    708 if self.estimator is not None:
--> 709     self._fit_with_estimator(inputs, job_name, include_cls_metadata, **kwargs)
    710 else:
    711     self._fit_with_estimator_dict(inputs, job_name, include_cls_metadata, estimator_kwargs)

File /opt/conda/lib/python3.10/site-packages/sagemaker/tuner.py:720, in HyperparameterTuner._fit_with_estimator(self, inputs, job_name, include_cls_metadata, **kwargs)
    718 self._prepare_estimator_for_tuning(self.estimator, inputs, job_name, **kwargs)
    719 self._prepare_for_tuning(job_name=job_name, include_cls_metadata=include_cls_metadata)
--> 720 self.latest_tuning_job = _TuningJob.start_new(self, inputs)

File /opt/conda/lib/python3.10/site-packages/sagemaker/tuner.py:1751, in _TuningJob.start_new(cls, tuner, inputs)
   1734 """Create a new Amazon SageMaker HyperParameter Tuning job.
   1735 
   1736 The new HyperParameter Tuning job uses the provided `tuner` and `inputs`
   (...)
   1747     information about the started job.
   1748 """
   1749 tuner_args = cls._get_tuner_args(tuner, inputs)
-> 1751 tuner.sagemaker_session.create_tuning_job(**tuner_args)
   1753 return cls(tuner.sagemaker_session, tuner._current_job_name)

File /opt/conda/lib/python3.10/site-packages/sagemaker/session.py:2119, in Session.create_tuning_job(self, job_name, tuning_config, training_config, training_config_list, warm_start_config, tags)
   2114 if training_config is not None and training_config_list is not None:
   2115     raise ValueError(
   2116         "Only one of training_config and training_config_list should be provided."
   2117     )
-> 2119 tune_request = self._get_tuning_request(
   2120     job_name=job_name,
   2121     tuning_config=tuning_config,
   2122     training_config=training_config,
   2123     training_config_list=training_config_list,
   2124     warm_start_config=warm_start_config,
   2125     tags=tags,
   2126 )
   2128 def submit(request):
   2129     LOGGER.info("Creating hyperparameter tuning job with name: %s", job_name)

File /opt/conda/lib/python3.10/site-packages/sagemaker/session.py:2163, in Session._get_tuning_request(self, job_name, tuning_config, training_config, training_config_list, warm_start_config, tags)
   2135 def _get_tuning_request(
   2136     self,
   2137     job_name,
   (...)
   2142     tags=None,
   2143 ):
   2144     """Construct CreateHyperParameterTuningJob request
   2145 
   2146     Args:
   (...)
   2159         dict: A dictionary for CreateHyperParameterTuningJob request
   2160     """
   2161     tune_request = {
   2162         "HyperParameterTuningJobName": job_name,
-> 2163         "HyperParameterTuningJobConfig": self._map_tuning_config(**tuning_config),
   2164     }
   2166     if training_config is not None:
   2167         tune_request["TrainingJobDefinition"] = self._map_training_config(**training_config)

TypeError: Session._map_tuning_config() got an unexpected keyword argument 'strategy_config'

System information
A description of your system. Please provide:

  • SageMaker Python SDK version: 2.116.0
  • Framework name (eg. PyTorch) or algorithm (eg. KMeans): tensorflow-tc-albert-en-base
  • Framework version: *
  • Python version: 3.10
  • CPU or GPU: GPU
  • Custom Docker image (Y/N): N

Additional context
Using the recently merged feature: #3440

@athewsey
Collaborator

Also just came across this while trying to use the Hyperband strategy with the SDK, using the plain Estimator class with XGBoost in algorithm mode.

@auroregosmant

auroregosmant commented Feb 1, 2023

Did you find any solution to this?

@yshen92
Author

yshen92 commented May 12, 2023

Did you find any solution to this?

Nope. Ended up using Bayesian Optimization instead.
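For reference, a minimal sketch of that Bayesian fallback, reusing the names from the report above (estimator, hyperparameter_ranges, metric_definitions, data_channels); since 'Bayesian' is the SDK's default strategy, strategy_config is simply omitted:

from sagemaker.tuner import HyperparameterTuner

# Sketch of the fallback: the same setup as in the report above, but with the
# default Bayesian strategy and no strategy_config, which avoids the bug.
tuner = HyperparameterTuner(
    estimator,
    objective_metric_name,
    hyperparameter_ranges,
    metric_definitions,
    max_jobs=10,                # illustrative values
    max_parallel_jobs=2,
    strategy='Bayesian',        # the default; shown explicitly for clarity
    early_stopping_type='Auto',
    objective_type='Maximize',
)
tuner.fit(inputs=data_channels, logs=True)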

@Sandy4321

Is this fixed? The spec is confusing:
https://sagemaker.readthedocs.io/en/stable/api/training/tuner.html
Where is it described how to set up the strategy? The docs mention strategy='Bayesian' or 'Grid', but where is the full description of how to choose a strategy? It seems strategy_config (sagemaker.tuner.StrategyConfig), "a configuration for the hyperparameter tuning job optimisation strategy", should be used.

@Sandy4321

Here strategy="Hyperband" is mentioned:
https://aws.amazon.com/blogs/machine-learning/effectively-solve-distributed-training-convergence-issues-with-amazon-sagemaker-hyperband-automatic-model-tuning/
tuner = HyperparameterTuner(
    xgb3,
    objective_metric_name,
    hyperparameter_ranges,
    max_jobs=30,
    max_parallel_jobs=4,
    strategy="Hyperband",
    early_stopping_type="Off",
    strategy_config=sc
)

What are all the options for strategy, and how are they used?
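For reference, the linked blog builds sc by wrapping a HyperbandStrategyConfig in a StrategyConfig before passing it to the tuner; a sketch of that pattern follows (the min_resource/max_resource values are illustrative only):

from sagemaker.tuner import StrategyConfig, HyperbandStrategyConfig

# Sketch of the pattern from the linked AWS blog: the HyperbandStrategyConfig
# is wrapped in a StrategyConfig, which is what `strategy_config` expects.
sc = StrategyConfig(
    hyperband_strategy_config=HyperbandStrategyConfig(
        min_resource=1,    # fewest epochs/iterations a training job may run
        max_resource=30,   # most epochs/iterations a training job may run
    )
)

Per the SageMaker tuning API, the valid strategy values are 'Bayesian' (the default), 'Random', 'Hyperband', and 'Grid'; StrategyConfig currently only carries Hyperband settings, so strategy_config matters only when strategy='Hyperband'.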

@Sandy4321

The SageMaker source code surely has the details of the option values. Do you know how to see the SageMaker code? Maybe install it on a local machine?
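The SDK is open source at https://github.com/aws/sagemaker-python-sdk, and the installed copy can also be read directly; a quick sketch:

import inspect
import sagemaker.tuner

# Locate the installed tuner module on disk, then print a class's source,
# e.g. StrategyConfig, to see the accepted option values.
print(sagemaker.tuner.__file__)
print(inspect.getsource(sagemaker.tuner.StrategyConfig))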

@martinRenou
Collaborator

This was fixed by #3516
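For anyone hitting this later, a quick sketch for checking the installed version (the exact release that shipped #3516 isn't stated here, so upgrading to the latest SDK is the safe option):

import sagemaker

# If Hyperband with strategy_config still raises the TypeError above,
# upgrade the SDK, e.g. `pip install -U sagemaker`, and retry.
print(sagemaker.__version__)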
