"Vector memory exhausted" error very common on Mac M1

Hi, I am running into an annoying issue. I have noticed that the "vector memory exhausted (limit reached?)" error is very common when running R and RStudio on my MacBook Pro M1 with 16 GB of RAM. In my case, it happens whenever I try to load data or environments larger than ~4 GB. This is quite surprising to me, given that my older MacBook Air, with only 8 GB of RAM, could load larger objects than this. The issue makes it very hard for me to work on the M1, since many of my projects are typically >4 GB.

Any suggestions on how to handle this? Very much appreciated.

Thank you!

Hi,

Welcome to the RStudio community!

Take a look at an older, similar post on this forum and see if that helps.
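
If I recall correctly, the usual suggestion in those threads is to raise R's vector memory cap by setting R_MAX_VSIZE in ~/.Renviron, something along these lines (the value below is only an example, not a recommendation):

    # ~/.Renviron -- can be opened with usethis::edit_r_environ(); restart R afterwards
    # The 100Gb value is just an example; pick something suited to your machine
    R_MAX_VSIZE=100Gb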

Good luck,
PJ

Hi PJ, thanks for sharing this. I had already seen this and other suggestions for similar problems, but the suggested solution doesn't really work here. I have experienced this error before on my older MacBook and was able to fix it there using the suggested steps. On the M1, though, it happens quite often with relatively small datasets. With my old Mac I wouldn't hit the error until a dataset approached 8 to 10 GB, but on the M1 it starts happening at around 4 GB, and I can't seem to solve it whatever I do.

Hi,

That's too bad! Unfortunately I'm not a macOS user and have no experience with this, so let's hope another forum member knows a way around this issue!

PJ

How are you quantifying the size of the data?
Perhaps the data is one size compressed on disk, but larger once uncompressed and held in memory...
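
For example, something like this shows how much RAM an object actually occupies once loaded (the object and file names are just placeholders):

    x <- readRDS("my_data.rds")           # or however you load the data
    print(object.size(x), units = "Gb")   # size of the object in memory
    # lobstr::obj_size(x) gives a similar figure and accounts for shared data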

R by default wants to hold objects entirely in RAM; special packages can be used to process parts of a dataset in chunks... for example disk.frame.
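
A rough sketch of that approach, assuming a large CSV as input (file, column, and object names are just placeholders):

    library(disk.frame)
    library(dplyr)
    setup_disk.frame()   # spins up one worker per CPU core by default

    # Convert the CSV into an on-disk, chunked disk.frame rather than reading it all into RAM
    big.df <- csv_to_disk.frame("big_file.csv", outdir = "big_file.df")

    # dplyr verbs run chunk by chunk; only the summarised result is pulled back into memory
    result <- big.df %>%
      group_by(some_group) %>%
      summarise(rows = n()) %>%
      collect()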
