Interestingly enough, in R, memory.limit(size=) does not allow a size beyond 4000MB, whereas in RStudio, memory.limit(size=) can be set to any limit.

Maybe you can work around this by binning/summarising the data before you plot it.
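For example, here's a minimal base-R sketch of the binning idea; `big` and its column `x` are hypothetical stand-ins for your data:

```r
# Sketch: summarise a large numeric column into bins before plotting,
# so the plotting code only ever sees a small data frame.
set.seed(1)
big <- data.frame(x = runif(1e6))

# Cut into 100 equal-width bins and count observations per bin
breaks <- seq(min(big$x), max(big$x), length.out = 101)
bins   <- cut(big$x, breaks = breaks, include.lowest = TRUE)
summ   <- as.data.frame(table(bins))

nrow(summ)  # 100 rows to plot instead of 1,000,000
# barplot(summ$Freq) now plots the binned counts cheaply
```

The same idea works with dplyr's `group_by()`/`summarise()` if you prefer the tidyverse.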

Do you have the 64-bit versions of RStudio and Windows?

https://www.rdocumentation.org/packages/utils/versions/3.6.1/topics/memory.size

I wonder if the Microsoft version of R, which I understand is fully 64 bit, would help.

R and RStudio memory usage documentation is surprisingly awful. Apparently you need to set a command line parameter, but I can't find how to do this.

I did some experiments. I have 24Gb physical RAM, Windows 10 64 bit. It's instructive to open the Windows Task Manager and watch the memory usage as you do this. It looks like I can use most of my physical memory without changing any settings or command line arguments. If I push it even further R/RStudio seems to start using the disk to store stuff, or just says no.

# https://www.rdocumentation.org/packages/utils/versions/3.6.1/topics/memory.size

Sys.info()
#>        sysname        release        version       nodename        machine 
#>      "Windows"       "10 x64"  "build 18362"      "DNZ2001"       "x86-64" 
#>          login           user effective_user 
#>    "WoodwardS"    "WoodwardS"    "WoodwardS"

memory.size()
#> [1] 46.31
memory.size(TRUE)
#> [1] 48.94
memory.limit()
#> [1] 24460

x <- data.frame(x = runif(1000000000*2))

memory.size()
#> [1] 15304.98
memory.size(TRUE)
#> [1] 15307.75
memory.limit()
#> [1] 24460

rm(x) # remove object
gc() # garbage collection
#>          used (Mb) gc trigger    (Mb)   max used    (Mb)
#> Ncells 511114 27.3    1145652    61.2     628574    33.6
#> Vcells 982837  7.5 1922617571 14668.5 2001035795 15266.7

memory.size() 
#> [1] 46.17
memory.size(TRUE)
#> [1] 15307.75
memory.limit()
#> [1] 24460

Created on 2019-10-31 by the reprex package (v0.3.0)

Again, please consider whether this is a ulimit issue. The IBM post I linked to shows how to check easily, from

SFU korn shell ... [do] a "ulimit -a" command

which looks like this on Mojave

core file size          (blocks, -c) 0
data seg size           (kbytes, -d) unlimited
file size               (blocks, -f) unlimited
max locked memory       (kbytes, -l) unlimited
max memory size         (kbytes, -m) unlimited
open files                      (-n) 4864
pipe size            (512 bytes, -p) 1
stack size              (kbytes, -s) 8192
cpu time               (seconds, -t) unlimited
max user processes              (-u) 709
virtual memory          (kbytes, -v) unlimited

My computer does not have Services for Unix, so I'm not able to carry out the rest of the steps. I think it has to be installed from a CD-ROM, which I do not have. Thanks for the help though!

So I tried to plot the graph again with Task Manager open. Memory usage for RStudio during the analysis is low to moderate. After around 15 seconds I get Error: cannot allocate vector of size 4 Gb. It seems like RStudio isn't even trying to obtain more memory from Windows to generate the plot.

Would my best bet be to simply find a more powerful computer?

The command line is just a quick way to check. The IBM link tells how to change ulimit in the registry, assuming that you have the requisite privileges and are comfortable with registry editing. All the additional RAM that can fit in your computer won't help if the operating system won't allow you to use more than a pre-set limit.

What do you get when you run the short memory.size script I provided above? You will need to change the size of x to fit your memory.
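As a rough guide when resizing x: a numeric (double) element takes 8 bytes, so you can estimate an allocation before attempting it. A quick sketch:

```r
# Estimate the RAM a numeric vector will need before allocating it:
# doubles are 8 bytes per element.
n  <- 1000000000 * 2        # the size used in the script above
gb <- n * 8 / 1024^3        # bytes -> GiB
round(gb, 1)
#> [1] 14.9
```

That matches the "14.9 Gb" figure in the allocation error; divide n by your available RAM accordingly.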

Sys.info()
#>        sysname        release        version          nodename        machine 
#>      "Windows"       "10 x64"  "build 18362" "DESKTOP-D53OR22"       "x86-64" 
#>          login           user effective_user 
#>        "samsk"        "samsk"        "samsk"

memory.size()
#> [1] 1508.41
memory.size(TRUE)
#> [1] 1525.69
memory.limit()
#> [1] 8067

x <- data.frame(x = runif(1000000000*2))
#> Error: cannot allocate vector of size 14.9 Gb

memory.limit(size = 100000)
#> [1] 1e+05 ### This must be an error as I don't have that much RAM.
memory.limit()
#> [1] 1e+05
memory.size()
#> [1] 1512.69 ### No difference.

I'm not familiar with the registry. The IBM link says to increase the "maxOpenFiles" by navigating to HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Services for Unix. However, my computer does not have "Services for Unix" under "Microsoft". How should I proceed? Which command line are you referring to?


Turns out "Services for Unix" seems to be a discontinued service of Windows.
en.wikipedia.org/wiki/Windows_Services_for_UNIX

Any other suggestions on how to proceed?


Thanks for the added detail. ulimit isn't something that can be set in R or RStudio; it comes from the operating system. I'm sorry I gave you an out-of-date fix. Windows's UNIX-like utilities are now accessible through PowerShell.

In your case, with only 8GB of RAM, increasing ulimit beyond 8067 will have no effect; your objects need to fit in RAM, and they obviously can't if you don't have enough.

If adding memory (to, say, 32GB) isn't an option, the best alternative is to rent time on a heavy-duty AWS EC2 instance using one of the pre-provisioned R images; I've heard of those but have no personal experience with them.


Try
x <- data.frame(x = runif(1000000000*0.5))
This will show whether R is using your available 8GB correctly.
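You can also sanity-check how much memory an object actually occupies with object.size(). A small sketch that runs on any machine (scale the arithmetic up for the 5e8-element version, which is roughly 5e8 * 8 bytes, or about 3.7 GiB):

```r
# Inspect the memory footprint of an object after creating it.
# Using 1e6 elements here so the example runs anywhere.
x <- data.frame(x = runif(1e6))
print(object.size(x), units = "MB")  # about 7.6 MB (8 bytes per double)
rm(x); invisible(gc())               # free it again when done
```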

Thanks for the info. I'll look into cloud computing or simply add more ram to my computer. Thanks again!

I am able to run this code. However, I believe that my computer simply does not have enough memory to run the analysis. I may purchase additional ram or take Richard's advice and look into cloud computing. Thanks for all the help!

I am not sure where you are at on this issue, but know that there is a bug regarding memory.limit(), R 3.6, and the RStudio IDE that has been resolved.

See

And the fix is available in the preview release.
See the release notes:

I really don't know if it is related (it seems a bit different), but it's worth a try just in case, and it's good to reference it here, I guess.


Fix incorrect memory.limit() result with R >= 3.6.0 (#4986)


It's been a while, but I finally got it done.
Amazon's cloud service virtual computers did the trick. Turns out I needed at least 80GB of RAM. Thanks for the help!


This topic was automatically closed 7 days after the last reply. New replies are no longer allowed.