RStudio and R memory allocation

I am very new to machine learning (using caret) and I am having trouble with RAM allocation in RStudio, but not in R.

I am trying to run a random forest model using cross-validation. After preparing my training data.table (149,717 observations, 5,392,816 bytes) I get an error message:

Error: cannot allocate vector of size 1.1 Gb
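For context, the setup is roughly along these lines (an illustrative sketch only; `train_dt` and the outcome column `y` stand in for my actual objects):

```r
# Illustrative sketch of the setup (train_dt and y stand in for my actual
# training data.table and outcome column).
library(caret)
library(data.table)

set.seed(42)

ctrl <- trainControl(method = "cv", number = 5)   # k-fold cross-validation

rf_fit <- train(
  y ~ .,                  # formula interface; caret expands this before fitting
  data      = train_dt,
  method    = "rf",       # random forest via the randomForest package
  trControl = ctrl
)
```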

The error means I don't have enough RAM to allocate the object. So I reduced my training data.table systematically, and eventually, with only 5% of the original training data, the model runs in RStudio without producing this error.
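The downsampling itself was nothing fancy, roughly:

```r
# Keep a random 5% of the rows and retrain on the smaller table
# (train_dt, y, and ctrl are the assumed names from the sketch above).
frac <- 0.05
idx  <- sample(nrow(train_dt), size = floor(frac * nrow(train_dt)))
train_small <- train_dt[idx]    # data.table row subsetting

rf_fit_small <- train(y ~ ., data = train_small, method = "rf",
                      trControl = ctrl)
```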

However, this is not ideal, and I need to run my model on a larger sample than 5% of the original size. I decided to test this in R (not RStudio), and my model works just fine with no memory issues using the full training data.table.

Looking at Task Manager while R and RStudio were processing, R seems far more efficient with memory than RStudio (RStudio used 100% of RAM while R used 65% when running the SAME test with the SAME data!).
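One way to cross-check this from inside each session, rather than through Task Manager, is with base R's own reporting:

```r
# Cross-check memory use from inside the session rather than Task Manager.
format(object.size(train_dt), units = "Mb")  # size of the training table itself
gc()   # runs garbage collection and reports current and maximum memory used
       # by the session ("max used" is the high-water mark so far)
```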

When I call the memory.limit() function in RStudio I get this output: 1.759219e+13. When I run the same function in R I get this output: 16291.
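For reference, memory.limit() is Windows-only: with no argument it reports the current limit in MB, and with a size argument it can raise (but not lower) the limit, e.g.:

```r
# memory.limit() is Windows-only: with no argument it reports the current
# limit in MB; with size = it requests a higher limit (it cannot be lowered).
memory.limit()              # ~16384 would be expected on a 16 GB machine
memory.limit(size = 16000)  # request roughly 16 GB explicitly
```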

Laptop specs: 256 GB SSD, 16 GB RAM, Windows 10 Pro 64-bit, i7 processor with 8 logical cores.

My question is: why is memory allocated so differently in R vs RStudio? The memory limit for R looks correct and corresponds to all the RAM I have, but this is not the case with RStudio. Is this a bug, or is my memory allocation just set up incorrectly within RStudio? If the latter, can someone please advise how to set up my RStudio memory allocation?

Also, any other ideas about what could be causing this issue?

This is not a complete answer (sorry!), but the wacky memory.limit() result is a known bug, with a fix underway.

Some previous discussion is here:

And the GitHub issue is here:

Thanks for your response. I'll wait for the new RStudio release.
