
change: add s3_analysis_config_output_path field in DataConfig constructor #2698


Merged
4 commits merged into aws:master on Oct 20, 2021

Conversation

@qidewenwhen (Member) commented on Oct 12, 2021

Issue #, if available: N/A

Description of changes:
This change adds an s3_analysis_config_output_path field to the DataConfig constructor so that the analysis_config file and the processing job output can use different S3 output paths.
This benefits the implementation of the SageMaker Pipelines Clarify step, since a pipeline step may use the DataConfig API. The step may put placeholders (such as ExecutionId or StepName) into s3_output_path, and those placeholders cannot be resolved at the time the API is called, yet the API needs a real S3 path to upload the analysis config file. The new parameter s3_analysis_config_output_path is added to accept that real S3 path.
The change is also backward compatible: if s3_analysis_config_output_path is not supplied, s3_output_path is used for the analysis_config output, as before.
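
As an illustration, a minimal sketch (not taken from this PR; the bucket names, prefixes, and the use of Join/ExecutionVariables for the placeholder are assumptions): s3_output_path carries an execution-time placeholder, while s3_analysis_config_output_path supplies the concrete S3 URI needed to upload the analysis config.

from sagemaker.clarify import DataConfig
from sagemaker.workflow.execution_variables import ExecutionVariables
from sagemaker.workflow.functions import Join

# Bucket names and prefixes below are hypothetical.
data_config = DataConfig(
    s3_data_input_path="s3://my-bucket/clarify/input",
    # Contains an execution-time placeholder, so it cannot be resolved
    # when DataConfig is constructed.
    s3_output_path=Join(
        on="/",
        values=["s3://my-bucket/clarify/output", ExecutionVariables.PIPELINE_EXECUTION_ID],
    ),
    # A concrete S3 URI the SDK can use for the analysis_config file.
    s3_analysis_config_output_path="s3://my-bucket/clarify/analysis-config",
    label="target",
    dataset_type="text/csv",
)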

Testing done: Added a unit test

Merge Checklist

Put an x in the boxes that apply. You can also fill these out after creating the PR. If you're unsure about any of them, don't hesitate to ask. We're here to help! This is simply a reminder of what we are going to look for before merging your pull request.

General

  • I have read the CONTRIBUTING doc
  • I certify that the changes I am introducing will be backward compatible, and I have discussed concerns about this, if any, with the Python SDK team
  • I used the commit message format described in CONTRIBUTING
  • I have passed the region in to all S3 and STS clients that I've initialized as part of this change.
  • I have updated any necessary documentation, including READMEs and API docs (if appropriate)

Tests

  • I have added tests that prove my fix is effective or that my feature works (if appropriate)
  • I have added unit and/or integration tests as appropriate to ensure backward compatibility of the changes
  • I have checked that my tests are not configured for a specific region or account (if appropriate)
  • I have used unique_name_from_base to create resource names in integ tests (if appropriate)

By submitting this pull request, I confirm that my contribution is made under the terms of the Apache 2.0 license.

@sagemaker-bot (Collaborator)

AWS CodeBuild CI Report

  • CodeBuild project: sagemaker-python-sdk-slow-tests
  • Commit ID: 8cb72df
  • Result: SUCCEEDED
  • Build Logs (available for 30 days)

Powered by github-codebuild-logs, available on the AWS Serverless Application Repository

@xgchena (Contributor) commented on Oct 14, 2021

For tracking purposes: per an offline chat with Dewen, a SageMaker Pipeline step will use the DataConfig API. The step may put placeholders (such as ExecutionId or StepName) into s3_output_path, and those placeholders cannot be resolved when the API is called, yet the API needs a real S3 path to upload the analysis config file. That is why the new parameter s3_analysis_config_output_path was added to accept the real S3 path.

@sagemaker-bot (Collaborator)

AWS CodeBuild CI Report

  • CodeBuild project: sagemaker-python-sdk-unit-tests
  • Commit ID: 8786ef3
  • Result: SUCCEEDED
  • Build Logs (available for 30 days)

Powered by github-codebuild-logs, available on the AWS Serverless Application Repository

@sagemaker-bot (Collaborator)

AWS CodeBuild CI Report

  • CodeBuild project: sagemaker-python-sdk-local-mode-tests
  • Commit ID: 8786ef3
  • Result: SUCCEEDED
  • Build Logs (available for 30 days)

Powered by github-codebuild-logs, available on the AWS Serverless Application Repository

@sagemaker-bot (Collaborator)

AWS CodeBuild CI Report

  • CodeBuild project: sagemaker-python-sdk-unit-tests
  • Commit ID: 8f60d20
  • Result: SUCCEEDED
  • Build Logs (available for 30 days)

Powered by github-codebuild-logs, available on the AWS Serverless Application Repository

@sagemaker-bot (Collaborator)

AWS CodeBuild CI Report

  • CodeBuild project: sagemaker-python-sdk-slow-tests
  • Commit ID: 8f60d20
  • Result: SUCCEEDED
  • Build Logs (available for 30 days)

Powered by github-codebuild-logs, available on the AWS Serverless Application Repository

@sagemaker-bot (Collaborator)

AWS CodeBuild CI Report

  • CodeBuild project: sagemaker-python-sdk-local-mode-tests
  • Commit ID: 8f60d20
  • Result: SUCCEEDED
  • Build Logs (available for 30 days)

Powered by github-codebuild-logs, available on the AWS Serverless Application Repository

@sagemaker-bot (Collaborator)

AWS CodeBuild CI Report

  • CodeBuild project: sagemaker-python-sdk-unit-tests
  • Commit ID: ffcfcc0
  • Result: SUCCEEDED
  • Build Logs (available for 30 days)

Powered by github-codebuild-logs, available on the AWS Serverless Application Repository

@sagemaker-bot (Collaborator)

AWS CodeBuild CI Report

  • CodeBuild project: sagemaker-python-sdk-slow-tests
  • Commit ID: ffcfcc0
  • Result: SUCCEEDED
  • Build Logs (available for 30 days)

Powered by github-codebuild-logs, available on the AWS Serverless Application Repository

@sagemaker-bot (Collaborator)

AWS CodeBuild CI Report

  • CodeBuild project: sagemaker-python-sdk-local-mode-tests
  • Commit ID: ffcfcc0
  • Result: SUCCEEDED
  • Build Logs (available for 30 days)

Powered by github-codebuild-logs, available on the AWS Serverless Application Repository

@ahsan-z-khan ahsan-z-khan merged commit 04824f7 into aws:master Oct 20, 2021
EthanShouhanCheng pushed a commit to SissiChenxy/sagemaker-python-sdk that referenced this pull request on Jan 11, 2022:
change: add s3_analysis_config_output_path field in DataConfig constructor (aws#2698)

Co-authored-by: Dewen Qi <[email protected]>
Co-authored-by: Shreya Pandit <[email protected]>
Co-authored-by: Navin Soni <[email protected]>
Co-authored-by: Ahsan Khan <[email protected]>
@maslick commented on May 23, 2023

It looks like it's not possible to parameterize s3_analysis_config_output_path using Pipeline parameters:

model_explainability_data_config = DataConfig(
    s3_data_input_path=step_process.properties.ProcessingOutputConfig.Outputs[
        "shap"
    ].S3Output.S3Uri,
    s3_output_path=ParameterString(name="s3_output_path", default_value="s3://helloworld/"),
    s3_analysis_config_output_path=ParameterString(name="s3_analysis_config_output_path", default_value="s3://helloworld/analasys_config"),
    label='target',
    dataset_type="text/csv",
)
Exception: s3_analysis_config_output_path cannot be of type ExecutionVariable/Expression/Parameter/Properties

Nor is it possible to leave s3_analysis_config_output_path at its default value, i.e. None:

model_explainability_data_config = DataConfig(
    s3_data_input_path=step_process.properties.ProcessingOutputConfig.Outputs[
        "shap"
    ].S3Output.S3Uri,
    s3_output_path=ParameterString(name="s3_output_path", default_value="s3://helloworld/"),
    label='target',
    dataset_type="text/csv",
)
Exception: `s3_output_path` cannot be of type ExecutionVariable/Expression/Parameter/Properties if `s3_analysis_config_output_path` is none or empty

The documentation states:

If this field is None, then the s3_output_path will be used to store the analysis_config output.
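
Judging from the two error messages above, the combination that appears to be accepted (a sketch only; the plain-string path below is illustrative) is a parameterized s3_output_path together with a plain-string s3_analysis_config_output_path:

model_explainability_data_config = DataConfig(
    s3_data_input_path=step_process.properties.ProcessingOutputConfig.Outputs[
        "shap"
    ].S3Output.S3Uri,
    s3_output_path=ParameterString(name="s3_output_path", default_value="s3://helloworld/"),
    # Plain string instead of a Parameter: the analysis config is uploaded
    # before pipeline parameters are resolved.
    s3_analysis_config_output_path="s3://helloworld/analysis_config",
    label='target',
    dataset_type="text/csv",
)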
