Make code_location an S3 URI instead of a bucket in training_config() #501


Merged 3 commits on Nov 20, 2018
Changes from 2 commits
1 change: 1 addition & 0 deletions CHANGELOG.rst
@@ -7,6 +7,7 @@ CHANGELOG

* feature: Estimators: dependencies attribute allows export of additional libraries into the container
* feature: Add APIs to export Airflow transform and deploy config
+* bug-fix: Allow code_location argument to be S3 URI in training_config API

1.15.0
======
3 changes: 2 additions & 1 deletion src/sagemaker/estimator.py
@@ -670,8 +670,9 @@ def __init__(self, entry_point, source_dir=None, hyperparameters=None, enable_cl
training jobs. This will be ignored for now and removed in a further release.
container_log_level (int): Log level to use within the container (default: logging.INFO).
Valid values are defined in the Python logging module.
-code_location (str): Name of the S3 bucket where custom code is uploaded (default: None).
+code_location (str): The S3 URI where custom code is uploaded (default: None).
Review comment (Contributor): let's fix this to make it clear we expect a prefix and will append /source/sourcedir.tar.gz

Reply (Author): Updated.

If not specified, default bucket created by ``sagemaker.session.Session`` is used.
+The default S3 path is default_bucket/job-name/source/.
Review comment (Contributor): including s3 protocol makes this clearer (s3://default_bucket/...)

Reply (Author): Updated.

image_name (str): An alternate image name to use instead of the official Sagemaker image
for the framework. This is useful to run one of the Sagemaker supported frameworks
with an image containing custom dependencies.
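
To make the documented behavior concrete, here is a minimal usage sketch. The bucket, prefix, role ARN, and file names below are placeholders, not taken from this PR; only the code_location semantics come from this change.

```python
from sagemaker.tensorflow import TensorFlow

# All names here are hypothetical; only code_location's meaning comes from this PR.
estimator = TensorFlow(
    entry_point='train.py',
    source_dir='./src',
    role='arn:aws:iam::111122223333:role/SageMakerRole',  # placeholder role ARN
    train_instance_count=1,
    train_instance_type='ml.m4.xlarge',
    # A full S3 URI prefix rather than a bare bucket name:
    code_location='s3://my-bucket/my/prefix',
)
# Per this change, the Airflow training_config() path would stage the code at:
#   s3://my-bucket/my/prefix/source/sourcedir.tar.gz
```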
8 changes: 6 additions & 2 deletions src/sagemaker/workflow/airflow.py
@@ -27,8 +27,12 @@ def prepare_framework(estimator, s3_operations):
estimator (sagemaker.estimator.Estimator): The framework estimator to get information from and update.
s3_operations (dict): The dict to specify s3 operations (upload `source_dir`).
"""
-bucket = estimator.code_location if estimator.code_location else estimator.sagemaker_session._default_bucket
-key = '{}/source/sourcedir.tar.gz'.format(estimator._current_job_name)
+if estimator.code_location is not None:
+    bucket, key = fw_utils.parse_s3_url(estimator.code_location)
+    key = os.path.join(key, 'source', 'sourcedir.tar.gz')
+else:
+    bucket = estimator.sagemaker_session._default_bucket
+    key = os.path.join(estimator._current_job_name, 'source', 'sourcedir.tar.gz')
script = os.path.basename(estimator.entry_point)
if estimator.source_dir and estimator.source_dir.lower().startswith('s3://'):
code_dir = estimator.source_dir
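The bucket/key split above comes from fw_utils.parse_s3_url. As a rough illustration only, a standalone stand-in (an assumption about the helper's behavior, not the SDK's actual implementation) might look like this:

```python
from urllib.parse import urlparse

def split_s3_uri(uri):
    """Illustrative stand-in for fw_utils.parse_s3_url: returns (bucket, key)."""
    parsed = urlparse(uri)
    if parsed.scheme != 's3':
        raise ValueError('Expecting an s3:// URI, got: {}'.format(uri))
    return parsed.netloc, parsed.path.lstrip('/')

bucket, key = split_s3_uri('s3://my-bucket/my/prefix')  # hypothetical URI
# bucket == 'my-bucket', key == 'my/prefix'
# prepare_framework() then joins on 'source/sourcedir.tar.gz', producing the
# key 'my/prefix/source/sourcedir.tar.gz'.
```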
9 changes: 3 additions & 6 deletions tests/unit/test_airflow.py
@@ -244,7 +244,7 @@ def test_framework_training_config_all_args(sagemaker_session):
source_dir="{{ source_dir }}",
enable_cloudwatch_metrics=False,
container_log_level="{{ log_level }}",
code_location="{{ bucket_name }}",
code_location="s3://{{ bucket_name }}/{{ prefix }}",
training_steps=1000,
evaluation_steps=100,
checkpoint_path="{{ checkpoint_path }}",
@@ -304,9 +304,7 @@ def test_framework_training_config_all_args(sagemaker_session):
'SecurityGroupIds': ['{{ security_group_ids }}']
},
'HyperParameters': {
-'sagemaker_submit_directory': '"s3://{{ bucket_name }}/{{ base_job_name }}-'
-                              '{{ execution_date.strftime(\'%Y-%m-%d-%H-%M-%S\') }}'
-                              '/source/sourcedir.tar.gz"',
+'sagemaker_submit_directory': '"s3://{{ bucket_name }}/{{ prefix }}/source/sourcedir.tar.gz"',
'sagemaker_program': '"{{ entry_point }}"',
'sagemaker_enable_cloudwatch_metrics': 'false',
'sagemaker_container_log_level': '"{{ log_level }}"',
@@ -322,8 +320,7 @@ def test_framework_training_config_all_args(sagemaker_session):
'S3Upload': [{
'Path': '{{ source_dir }}',
'Bucket': '{{ bucket_name }}',
'Key': "{{ base_job_name }}-{{ execution_date.strftime('%Y-%m-%d-%H-%M-%S') }}"
"/source/sourcedir.tar.gz",
'Key': "{{ prefix }}/source/sourcedir.tar.gz",
'Tar': True}]
}
}
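
Putting the pieces together, a hedged end-to-end sketch of the Airflow path. All names are hypothetical; the expected keys simply mirror the test expectations above.

```python
from sagemaker.tensorflow import TensorFlow
from sagemaker.workflow.airflow import training_config

# Hypothetical estimator, as in the earlier sketch.
estimator = TensorFlow(
    entry_point='train.py',
    source_dir='./src',
    role='arn:aws:iam::111122223333:role/SageMakerRole',  # placeholder
    train_instance_count=1,
    train_instance_type='ml.m4.xlarge',
    code_location='s3://my-bucket/my/prefix',
)

config = training_config(estimator=estimator, inputs='s3://my-bucket/train-data')

# With this change, the generated config should contain entries like:
#   config['HyperParameters']['sagemaker_submit_directory']
#       == '"s3://my-bucket/my/prefix/source/sourcedir.tar.gz"'
#   config['S3Operations']['S3Upload'][0]['Bucket'] == 'my-bucket'
#   config['S3Operations']['S3Upload'][0]['Key'] == 'my/prefix/source/sourcedir.tar.gz'
```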