
Parameter Secrets - too many open files #4003


Closed
tb102122 opened this issue Mar 21, 2024 · 4 comments

@tb102122

Expected Behaviour

Return of Secret details

Current Behaviour

"errorMessage": "SSL validation failed for https://secretsmanager.eu-central-1.amazonaws.com/ [Errno 24] Too many open files",
"errorType": "GetParameterError",
"stackTrace": [
" File "/var/task/xxx.py", line 70, in lambda_handler\n secret_details = ssm_provider.get(secretName)\n",
" File "/opt/python/aws_lambda_powertools/utilities/parameters/base.py", line 139, in get\n raise GetParameterError(str(exc))\n"
]
}

Code snippet

import json

import boto3
from aws_lambda_powertools.utilities import parameters

# Secrets Manager client passed explicitly to the Powertools provider
secrets_mgn = boto3.client("secretsmanager")
ssm_provider = parameters.SecretsProvider(boto3_client=secrets_mgn)

secretName = "XXX"
secret_details = ssm_provider.get(secretName)
secret_details = json.loads(secret_details)

Possible Solution

No response

Steps to Reproduce

I am not sure how to reproduce it. I assume it was caused by several quick invocations of the same Lambda function.

Powertools for AWS Lambda (Python) version

latest

AWS Lambda function runtime

3.11

Packaging format used

Lambda Layers

Debugging logs

No response

tb102122 added the bug and triage labels Mar 21, 2024
@leandrodamascena
Contributor

Hey @tb102122! Reproducing the error here can be quite challenging, especially since other parts of your code may be affecting the Lambda file descriptor limit.

AWS Lambda imposes certain hard limits when running code, one of which is the file descriptor limit; a file descriptor is a unique identifier for a file or any other I/O resource.

Are you opening/saving many files? Are you iterating over something and creating a new boto3 instance in every iteration (see the sketch below)? Any other I/O operation?
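Purely as an illustration, here is the kind of hypothetical pattern that leaks descriptors (a new client per invocation creates a new connection pool, and its sockets count against the limit), together with the usual fix of creating the client once at module scope. None of this code is from your report:

import boto3
from aws_lambda_powertools.utilities import parameters

# Anti-pattern (hypothetical): a fresh client, and therefore a fresh
# connection pool, is created on every single invocation.
def handler_new_client_each_time(event, context):
    client = boto3.client("secretsmanager")
    provider = parameters.SecretsProvider(boto3_client=client)
    return provider.get("XXX")

# Preferred: create the client once at module scope so warm invocations
# reuse the same connection pool and its sockets.
shared_client = boto3.client("secretsmanager")
shared_provider = parameters.SecretsProvider(boto3_client=shared_client)

def handler_shared_client(event, context):
    return shared_provider.get("XXX")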

I'm afraid I can't assist you here, as this error wasn't generated by Powertools, but rather by botocore when making the HTTP request to the AWS endpoint 😞.

leandrodamascena added the not-a-bug label and removed the bug and triage labels Mar 22, 2024
leandrodamascena self-assigned this Mar 22, 2024
@tb102122
Author

Thanks for the explanation.
I do batch-upload some files to S3, but that happens at a later step in the code.
I also only upload around 50 files, so I don't expect the file descriptor limit to be reached.

@leandrodamascena
Contributor

Maybe you are reading files before calling the parameters utility? Or performing some other operation that could impact the file descriptor count.
A good way to debug would be to comment out parts of the code and check where the error occurs after several Lambda invocations; the sketch below shows one way to instrument that.
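A minimal sketch of how you could do that, assuming the Linux /proc filesystem available in the Lambda runtime (log_fd_usage is a hypothetical helper, not a Powertools API):

import os
import resource

def log_fd_usage(label):
    # Descriptors currently open by this process (Lambda runs on Linux).
    open_fds = len(os.listdir("/proc/self/fd"))
    # Soft per-process descriptor limit (1,024 on Lambda at the time of writing).
    soft_limit, _hard = resource.getrlimit(resource.RLIMIT_NOFILE)
    print(f"{label}: {open_fds}/{soft_limit} file descriptors in use")

def lambda_handler(event, context):
    log_fd_usage("start")
    # ... run one suspected block at a time, logging before and after ...
    log_fd_usage("after secrets call")

Logging before and after each suspected block should show which step accumulates descriptors across warm invocations.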
I will close this issue; we can reopen it if you still have any questions or if we can help further.

Thanks

leandrodamascena moved this from Coming soon to Shipped in Powertools for AWS Lambda (Python) Mar 28, 2024