Help with Size issue

Hello

My dataset is very large (558,824 rows, 10 columns), and after performing some operations for my needs it grows very large and I am getting errors:
Error: cannot allocate vector of size 541.5 Mb

What can be done to take care of this size issue? Is there anything I need to do in RStudio, or should I request that my IT team add more storage for me?

Thanks!

This has to do with the amount of RAM in your computer. How much RAM do you have, and what kind of computer are you using?

You may also want to post the code that is causing the problem. A dataset of that size should use around 43 MB. Something else is probably being generated.
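
As a rough sanity check, assuming all 10 columns are numeric doubles (8 bytes each):

# 558,824 rows x 10 double columns x 8 bytes each, in MiB
558824 * 10 * 8 / 1024^2
#> [1] 42.63489

So the data frame itself is small; a failed 541.5 Mb allocation points to some intermediate result (for example, a join that multiplies rows, or a wide model matrix) being far larger than the original data.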

My RAM is 16.0 GB (15.8 GB usable).

Here is a function I wrote that shows how much space the objects in my global environment are taking up at any given point in time. It relies on the pryr package.

memtrack <- function() {
  require(tidyverse)
  require(pryr)
  # Report the size of every object in the global environment, largest first
  sort(decreasing = TRUE, sapply(ls(envir = .GlobalEnv), function(x) {
    pryr::object_size(get(x, envir = .GlobalEnv))
  })) %>%
    enframe() %>%
    mutate(
      pcnt = value / sum(value),  # each object's share of the total
      size = paste0(round(value / (1024 * 1024), digits = 2), "Mb")
    )
}

You can use it to see how much of your memory is tied up in objects that you've kept around and could maybe dispense with to free up space again.
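
A quick usage sketch (this assumes pryr and the tidyverse are installed; big_raw is a made-up object just for illustration):

big_raw <- matrix(rnorm(5e6), ncol = 10)  # hypothetical leftover object, ~40 MB
memtrack()                                # big_raw should top the list
rm(big_raw)                               # drop it from the global environment
gc()                                      # prompt R to actually release the memory

rm() only removes the binding; calling gc() afterwards encourages R to return the freed memory, which can help you recover without restarting the session.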
