fix: Normalizing job_name in the ProcessingStep. #2786

Status: Closed. Wants to merge 1 commit.
2 changes: 2 additions & 0 deletions src/sagemaker/workflow/steps.py
@@ -496,6 +496,7 @@ def __init__(
         self.job_arguments = job_arguments
         self.code = code
         self.property_files = property_files
+        self.job_name = name
Contributor:

If we use the step name as the job name, then all input and output locations will be overwritten when the args are normalized. In practical terms, that means any pipeline with a step named MyProcessingStep will start using the code for this processor.

Things get a little better if we include the pipeline_name in the job_name, but that still means the code will be overwritten, which will prevent reproducibility (i.e., looking at a previous execution and going to S3 to see the script that was run).

Let me give this some thought. We need some signal of intentionality from the user here to maintain backwards compatibility.
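To make the collision concrete, here is a minimal sketch of how a job-name-derived S3 prefix would collide across pipelines. This is an illustration, not the SDK's actual code: `normalized_output_uri` and its path layout are assumptions.

```python
# Hypothetical sketch of a SageMaker-style default output path that is
# derived from the job name. normalized_output_uri is an illustration,
# not the SDK's real helper.
def normalized_output_uri(bucket: str, job_name: str, output_name: str) -> str:
    return f"s3://{bucket}/{job_name}/output/{output_name}"

# Two unrelated pipelines that both name a step "MyProcessingStep" would
# normalize to the same prefix and overwrite each other's artifacts.
uri_a = normalized_output_uri("team-bucket", "MyProcessingStep", "train")
uri_b = normalized_output_uri("team-bucket", "MyProcessingStep", "train")
assert uri_a == uri_b  # same location: later runs clobber earlier ones
```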

Contributor:

I think the best approach here is to hash the caller's script contents and include that in the job_name. This ensures there will be cache hits/misses as appropriate while not requiring any input from the user. I can work on this change since it's a little heavier weight than initially expected.
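A minimal sketch of that idea, assuming a helper that folds a digest of the script bytes into the job name (`job_name_with_code_hash` is hypothetical, not part of the SDK):

```python
import hashlib

def job_name_with_code_hash(step_name: str, code: bytes) -> str:
    # Include a short digest of the caller's script in the job name so the
    # cache key changes exactly when the code changes.
    digest = hashlib.sha256(code).hexdigest()[:8]
    return f"{step_name}-{digest}"

# Same code -> same name (cache hit); changed code -> new name (cache miss).
```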

Author:

Good callout! How do you feel about using the enable_caching boolean as the signal of intentionality from the user? Something like:

```python
name = pipeline_name + job_name
self.job_name = name if cache_config.enable_caching else None
```
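A runnable sketch of this proposal, with a stand-in `CacheConfig` (the real class lives in `sagemaker.workflow.steps`; `resolve_job_name` is a hypothetical helper, not SDK code):

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class CacheConfig:  # stand-in for sagemaker.workflow.steps.CacheConfig
    enable_caching: bool = False

def resolve_job_name(pipeline_name: str, step_name: str,
                     cache_config: Optional[CacheConfig]) -> Optional[str]:
    # Only pin a deterministic job name when the user explicitly opted into
    # caching; otherwise fall back to the timestamped auto-generated name.
    if cache_config is not None and cache_config.enable_caching:
        return f"{pipeline_name}-{step_name}"
    return None
```

Returning None preserves today's behavior (a fresh auto-generated job name per run), so only users who opt into caching see the new deterministic naming.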

Author:

Your comment loaded later for me. I like your idea as well. It would go against the documentation here:

> Pipelines doesn't check whether the data or code that the arguments point to has changed

As a user, though, I do believe it would be nice to evolve the step-caching functionality to detect data/code changes!

Contributor:

#2790

I'm going to close this PR.


@@ -514,6 +515,7 @@ def arguments(self) -> RequestType:
             ProcessingJobName and ExperimentConfig cannot be included in the arguments.
         """
         normalized_inputs, normalized_outputs = self.processor._normalize_args(
+            job_name=self.job_name,
             arguments=self.job_arguments,
             inputs=self.inputs,
             outputs=self.outputs,
1 change: 1 addition & 0 deletions tests/unit/sagemaker/workflow/test_steps.py
@@ -520,6 +520,7 @@ def test_processing_step_normalizes_args(mock_normalize_args, sagemaker_session)
     mock_normalize_args.return_value = [step.inputs, step.outputs]
     step.to_request()
     mock_normalize_args.assert_called_with(
+        job_name=step.name,
         arguments=step.job_arguments,
         inputs=step.inputs,
         outputs=step.outputs,