High Memory usage in Library when used inside a Pod #836
Comments
Which version of the library are you using? It's important because, starting from 9.0.0, there are some memory optimizations.
Hey @tomplus, I am using 9.0.0.
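For reference (not part of the original thread), one quick way to confirm which client version is actually installed in the pod is to read the package's version attribute; running `pip show kubernetes` in the container gives the same information.

```python
# Print the installed kubernetes client version (illustrative check only).
import kubernetes

print(kubernetes.__version__)
```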
Issues go stale after 90d of inactivity. If this issue is safe to close now please do so with /close. Send feedback to sig-testing, kubernetes/test-infra and/or fejta.
Stale issues rot after 30d of inactivity. If this issue is safe to close now please do so with /close. Send feedback to sig-testing, kubernetes/test-infra and/or fejta.
Rotten issues close after 30d of inactivity. Send feedback to sig-testing, kubernetes/test-infra and/or fejta.
@fejta-bot: Closing this issue. In response to this:
Instructions for interacting with me using PR comments are available here. If you have questions or suggestions related to my behavior, please file an issue against the kubernetes/test-infra repository.
/reopen
@CastixGitHub: You can't reopen an issue/PR unless you authored it or you are a collaborator. In response to this:
Instructions for interacting with me using PR comments are available here. If you have questions or suggestions related to my behavior, please file an issue against the kubernetes/test-infra repository.
I have incorporated the k8s Python client library in a monitoring application, which checks for connections between the other pods of the same app and exposes the metrics for the Prometheus server to scrape.
Following is the function that does this:
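The original snippet was not preserved in this copy of the issue. Below is a minimal sketch of what such a function might look like, assuming in-cluster configuration, pods selected by a hypothetical label `app=my-app`, a fixed target port, and the `prometheus_client` library for exposing a gauge; all names, labels, and ports are illustrative, not the reporter's actual code.

```python
# Hypothetical sketch of a pod-connectivity monitor; not the original function.
from kubernetes import client, config
from prometheus_client import Gauge, start_http_server
import socket
import time

# Gauge exposed for Prometheus to scrape: 1 if the pod accepted a TCP
# connection, 0 otherwise. Name and labels are assumptions.
POD_REACHABLE = Gauge(
    "app_pod_reachable",
    "1 if a TCP connection to the pod succeeded, 0 otherwise",
    ["pod"],
)

def check_pod_connections(namespace="default", label_selector="app=my-app", port=8080):
    # List the peer pods of the same app and try a short TCP connection to each.
    v1 = client.CoreV1Api()
    pods = v1.list_namespaced_pod(namespace, label_selector=label_selector)
    for pod in pods.items:
        ip = pod.status.pod_ip
        if not ip:
            continue  # pod not scheduled / no IP yet
        try:
            with socket.create_connection((ip, port), timeout=2):
                POD_REACHABLE.labels(pod=pod.metadata.name).set(1)
        except OSError:
            POD_REACHABLE.labels(pod=pod.metadata.name).set(0)

if __name__ == "__main__":
    config.load_incluster_config()  # running inside the pod
    start_http_server(8000)         # metrics endpoint for Prometheus
    while True:
        check_pod_connections()
        time.sleep(30)
```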
resources:
  limits:
    memory: 128Mi
  requests:
    memory: 128Mi
After using the Python client, the memory usage pattern is as shown in the attached graph.
Please help me with this issue.