Can't use a large cache size limit #188


Closed
brookslogan opened this issue Sep 28, 2023 · 5 comments · Fixed by #189

Comments

@brookslogan
Contributor

epidatr::set_cache(max_size = 10000L)
#> Warning in cachem::cache_disk(dir = cache_dir, max_size = as.integer(max_size *
#> : NAs introduced by coercion to integer range

Created on 2023-09-28 with reprex v2.0.2

I believe R integers are always 32-bit (max `.Machine$integer.max` = 2^31 − 1), so coercing `10000 * 1024^2` bytes to an integer overflows.
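The overflow can be reproduced in plain R, independent of epidatr (a minimal sketch; the last line assumes the byte count can simply be left as a double, which `cachem::cache_disk()` accepts, its documented default being `1024 * 1024^2`):

```r
# R integers are 32-bit; the largest representable value is 2^31 - 1.
.Machine$integer.max
#> [1] 2147483647

# 10000 MB converted to bytes exceeds that range, so the coercion
# produces NA with the same warning as above:
as.integer(10000 * 1024^2)
#> Warning: NAs introduced by coercion to integer range
#> [1] NA

# Keeping the byte count as a double avoids the overflow entirely:
10000 * 1024^2
#> [1] 10485760000
```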

@brookslogan brookslogan changed the title Can't use a large cache Can't use a large cache size limit Sep 28, 2023
@dshemetov dshemetov added this to the 1.2 release milestone Sep 29, 2023
@dsweber2
Contributor

So... that's 10,000GB. I'm not sure we have 10TB worth of data, and caching that is... not worth it. XD

Was it not clear that max_size is expressed in GB?

@brookslogan
Contributor Author

No, it's in MB?

max_size: the size of the entire cache, in MB, at which to start
          pruning entries. By default this is '1024', or 1GB. The
          environmental variable is 'EPIDATR_CACHE_MAX_SIZE_MB'.

@brookslogan brookslogan reopened this Oct 3, 2023
@dsweber2
Contributor

dsweber2 commented Oct 3, 2023

You're right, not sure what I was thinking at the time. The PR Dmitry made should fix the error you were running into though.

@brookslogan
Contributor Author

brookslogan commented Oct 3, 2023

Is it merged already? [i.e. can we just close this again?]

@brookslogan
Contributor Author

Looks like it. This is indeed closed by #189. Somehow I read over this and thought it had been closed as not planned.
