Some hyperparameters not being used in Sagemaker Image-classification #310


Closed
vishav opened this issue Jul 23, 2018 · 1 comment



vishav commented Jul 23, 2018

Please fill out the form below.

System Information

  • Framework (e.g. TensorFlow) / Algorithm (e.g. KMeans):
  • Framework Version:
  • Python Version:
  • CPU or GPU:
  • Python SDK Version:
  • Are you using a custom image:

Describe the problem

```python
num_layers = 18
image_shape = "3,224,224"
num_training_samples = 3988
num_classes = 48
mini_batch_size = 64
epochs = 60
optimizer = 'adam'
learning_rate = 0.1
use_lr_scheduler = 1
lr_scheduler_factor = 0.1
lr_scheduler_step = '10,20,30,40,50,59'
top_k = 2
use_pretrained_model = 1
augmentation_type = 'crop_color_transform'
```
I am able to start training without any issues, but when I look at the logs produced, I see the following messages:
```
[07/23/2018 18:40:34 INFO 140330486703936] Reading provided configuration from /opt/ml/input/config/hyperparameters.json: {u'learning_rate': u'0.1', u'use_pretrained_model': u'1', u'epochs': u'60', u'num_training_samples': u'3988', u'num_layers': u'18', u'mini_batch_size': u'64', u'image_shape': u'3,224,224', u'num_classes': u'48'}
[07/23/2018 18:40:34 INFO 140330486703936] lr_scheduler_step defined without lr_scheduler_factor, will be ignored...
[07/23/2018 18:40:34 INFO 140330486703936] augmentation_type: None
[07/23/2018 18:40:34 INFO 140330486703936] checkpoint_frequency: 60
```
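Comparing the hyperparameters defined in the notebook with the keys SageMaker reports reading from `hyperparameters.json` makes the gap explicit (a quick sketch; both sets are copied from the snippets above):

```python
# Hyperparameters assigned in the notebook (from the snippet above).
defined = {
    "num_layers", "image_shape", "num_training_samples", "num_classes",
    "mini_batch_size", "epochs", "optimizer", "learning_rate",
    "use_lr_scheduler", "lr_scheduler_factor", "lr_scheduler_step",
    "top_k", "use_pretrained_model", "augmentation_type",
}

# Keys SageMaker reports reading from /opt/ml/input/config/hyperparameters.json
# (from the first log line above).
received = {
    "learning_rate", "use_pretrained_model", "epochs",
    "num_training_samples", "num_layers", "mini_batch_size",
    "image_shape", "num_classes",
}

# Hyperparameters that never reached the training job.
missing = sorted(defined - received)
print(missing)
# -> ['augmentation_type', 'lr_scheduler_factor', 'lr_scheduler_step',
#     'optimizer', 'top_k', 'use_lr_scheduler']
```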

Why is SageMaker not using the optimizer, lr_scheduler_factor, and augmentation_type that I specified?


ssaini4 commented Jul 25, 2018

All hyperparameters passed into the algorithm will be used for training. According to the logs, you set values for these hyperparameters but did not pass them into the "training_params" dictionary in the Training cell. To use the hyperparameters you listed, use a snippet like this:

```python
"HyperParameters": {
    "num_layers": str(num_layers),
    "image_shape": image_shape,
    "num_training_samples": str(num_training_samples),
    "num_classes": str(num_classes),
    "mini_batch_size": str(mini_batch_size),
    "epochs": str(epochs),
    "optimizer": optimizer,
    "learning_rate": str(learning_rate),
    "lr_scheduler_step": lr_scheduler_step,
    "lr_scheduler_factor": str(lr_scheduler_factor),
    "top_k": str(top_k),
    "use_pretrained_model": str(use_pretrained_model),
    "augmentation_type": augmentation_type
},
```

Also note that the Amazon SageMaker image classification algorithm does not support the hyperparameter "use_lr_scheduler".
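For context, the "HyperParameters" mapping is one field of the request passed to the low-level CreateTrainingJob API, and every value must be a string. Below is a minimal sketch of how it might sit inside `training_params`; the job name is illustrative, and the other required request fields (AlgorithmSpecification, RoleArn, InputDataConfig, OutputDataConfig, ResourceConfig, StoppingCondition) are only mentioned in a comment, not filled in:

```python
# Values from the notebook snippet above.
num_layers = 18
image_shape = "3,224,224"
num_training_samples = 3988
num_classes = 48
mini_batch_size = 64
epochs = 60
optimizer = "adam"
learning_rate = 0.1
lr_scheduler_factor = 0.1
lr_scheduler_step = "10,20,30,40,50,59"
top_k = 2
use_pretrained_model = 1
augmentation_type = "crop_color_transform"

training_params = {
    "TrainingJobName": "image-classification-example",  # illustrative name
    "HyperParameters": {
        "num_layers": str(num_layers),
        "image_shape": image_shape,
        "num_training_samples": str(num_training_samples),
        "num_classes": str(num_classes),
        "mini_batch_size": str(mini_batch_size),
        "epochs": str(epochs),
        "optimizer": optimizer,
        "learning_rate": str(learning_rate),
        "lr_scheduler_step": lr_scheduler_step,
        "lr_scheduler_factor": str(lr_scheduler_factor),
        "top_k": str(top_k),
        "use_pretrained_model": str(use_pretrained_model),
        "augmentation_type": augmentation_type,
    },
    # AlgorithmSpecification, RoleArn, InputDataConfig, OutputDataConfig,
    # ResourceConfig, and StoppingCondition are also required but omitted here.
}

# CreateTrainingJob rejects non-string hyperparameter values, so a quick
# sanity check before submitting the request:
assert all(isinstance(v, str) for v in training_params["HyperParameters"].values())
```

With this in place, the request would be submitted via `boto3`'s `sagemaker` client (`client.create_training_job(**training_params)`), and the container log should then list all of these keys when it reads `hyperparameters.json`.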

@vishav vishav closed this as completed Jul 29, 2018
apacker pushed a commit to apacker/sagemaker-python-sdk that referenced this issue Nov 15, 2018
Add DeepAR Electricity Forecasting notebook.
knakad pushed a commit to knakad/sagemaker-python-sdk that referenced this issue Mar 31, 2020