Describe the bug
Currently, the value length constraints for environment variables are quite limited in CreateProcessingJob and CreateTrainingJob. This causes issues when passing URLs or other long strings via the environment parameter, as they often exceed the limit. I am not aware of an alternative way to pass items that shouldn't be written to files to a job at runtime. Storing this information on S3 and treating it as an input cannot be used as a workaround, as some variables should not be shared with everyone who has access to the S3 bucket, nor stored long term. It's also inconvenient to write a notebook's or user's env vars to S3, read them in, set them as env vars, etc.
It looks like it would be possible to have an API call accept values that are over 256 characters in length. For example, https://docs.aws.amazon.com/sagemaker/latest/APIReference/API_CreateTrainingJob.html#sagemaker-CreateTrainingJob-request-HyperParameters accepts 2500 characters.
To reproduce
Pass a string over the character limit, similar to the sketch below.
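For illustration, a minimal reproduction sketch of such a call; the role ARN, image URI, and instance type are placeholders, not values taken from the original report:

```python
# Hypothetical reproduction sketch -- the role ARN, image URI, and instance
# type below are placeholders, not values from the original report.
from sagemaker.processing import Processor

# Any value longer than 256 characters triggers the ValidationException
# shown under "Screenshots or logs".
long_value = "https://example.com/some/long/url?token=" + "x" * 300

processor = Processor(
    role="arn:aws:iam::123456789012:role/ExampleSageMakerRole",  # placeholder
    image_uri="123456789012.dkr.ecr.us-east-1.amazonaws.com/example:latest",  # placeholder
    instance_count=1,
    instance_type="ml.m5.xlarge",
    env={"LONG_ENV_VAR": long_value},  # maps to the Environment field of CreateProcessingJob
)

processor.run()  # the CreateProcessingJob request is rejected server-side
```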
Expected behavior
Ability to pass longer environment variables to jobs.
Screenshots or logs
botocore.exceptions.ClientError: An error occurred (ValidationException) when calling the CreateProcessingJob operation: 1 validation error detected: Value '<CENSORED>' at 'environment' failed to satisfy constraint: Map value must satisfy constraint: [Member must have length less than or equal to 256, Member must have length greater than or equal to 0, Member must satisfy regular expression pattern: [\S\s]*]
System information
A description of your system. Please provide:
SageMaker Python SDK version: 2.52.0
Framework name (eg. PyTorch) or algorithm (eg. KMeans): N/A
Framework version: N/A
Python version: 3.9
CPU or GPU: CPU
Custom Docker image (Y/N): Both
Additional context
I suspect others may be facing the same issue, as there is mention of passing custom URLs for pip configurations in #2207 (comment).
Yeah, this limit does not work very well with their other services.
These pipelines are regularly added as part of Step Functions workflows. I am trying to pass the Step Functions Task Token (which can be up to 1024 characters long) to the post-processing step of our SageMaker transform pipeline and am hitting this wall.
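To make the constraint concrete, here is a sketch of the job-side half of the Step Functions callback pattern this comment describes, assuming the token were delivered through an environment variable named SFN_TASK_TOKEN (an assumed name, not defined by SageMaker or Step Functions):

```python
# Sketch of the callback pattern referenced above. SFN_TASK_TOKEN is an
# assumed variable name; since the token can be up to 1024 characters, it
# does not fit within the 256-character limit on SageMaker environment values.
import json
import os

import boto3

sfn = boto3.client("stepfunctions")

task_token = os.environ["SFN_TASK_TOKEN"]  # would exceed the allowed length

# ... run the post-processing work, then signal completion to Step Functions.
sfn.send_task_success(
    taskToken=task_token,
    output=json.dumps({"status": "done"}),
)
```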