Failure to allocate vector with plenty of RAM available

Hi

I am working with genomic data, which is quite large. Sometimes it is simply too large to handle, but here I am seeing some strange behaviour: I get the error "cannot allocate vector of size 242 Mb", which is strange because if I check Task Manager I can see that I have around 2.4 GB of free RAM available.

Does anybody have an idea why this is happening?
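One quick check before digging further is to look at what the R session itself is already holding, since Task Manager only tells you what the operating system has free. A minimal sketch (it just inspects whatever happens to be in your workspace):

```r
# Total memory the R session is using right now, in MB
# (column 2 of gc() reports megabytes in use for R's two memory pools).
sum(gc()[, 2])

# The five largest objects in the global environment, in MB.
sizes <- sapply(ls(envir = .GlobalEnv),
                function(nm) object.size(get(nm, envir = .GlobalEnv)))
head(sort(sizes, decreasing = TRUE), 5) / 1024^2
```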

If you are on Linux or macOS, enter ulimit -a in a terminal, which shows the per-process resource limits, including the maximum amount of memory a process may use. The limit is often "unlimited" by default, but it may have been set lower on your system. See man ulimit (or your shell's manual) for how to adjust it.
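If you would rather check from inside R, the limit can also be queried through the shell that system() spawns, which inherits the R process's limits. This is only a sketch for Linux/macOS and assumes an sh-compatible shell that supports ulimit -v:

```r
# Per-process virtual memory limit (Linux/macOS).
# "unlimited" means no cap; otherwise the cap is reported in kilobytes.
system("ulimit -v", intern = TRUE)
```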

For Windows, I don’t know.

I think it could be related to memory fragmentation, i.e. you may have a total of 2.4 GB free, but it's a collection of smaller regions, none of which alone is big enough to hold the new vector. R needs one contiguous block of address space for a vector.
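For a sense of scale, a single 242 MB request corresponds to one contiguous block of roughly 31.7 million double-precision values (assuming a plain numeric vector at 8 bytes per element), so a hole of at least that size has to exist somewhere in R's address space:

```r
# Approximate number of doubles in one 242 MB allocation request.
242 * 1024^2 / 8   # about 31.7 million elements
```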

I think I've seen that kind of thing with functions that do a lot of internal computation. While running, the function creates several intermediate objects in memory until it runs out. So the last object, the one that fails to allocate, may be only 242 MB, but before that R had already created 2.39 GB worth of other objects.
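Here is a toy sketch of that pattern (the sizes are deliberately small, and `steps` just stands in for whatever intermediate results the real function keeps alive):

```r
# Each iteration keeps its intermediate result, so the memory in use climbs
# steadily until one more allocation pushes the session over the limit.
steps <- vector("list", 3)
for (i in seq_along(steps)) {
  steps[[i]] <- numeric(5e6)   # roughly 38 MB of doubles per step
  cat("after step", i, "- MB in use:", round(sum(gc()[, 2])), "\n")
}
```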

If you have a way to monitor free RAM (e.g. the Windows Task Manager with a fast refresh rate), what I typically see is:

  • I start the function
  • for several seconds (or minutes) my computer works hard, the fans might turn on, and I see the used RAM increase steadily
  • the function runs the one-too-many allocation and stops with the error "cannot allocate vector of size xx Mb"; the RAM is immediately freed back to the level it was at when I started the function, and the fans stop blowing (the sketch below shows the same pattern in isolation)
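You can provoke the same behaviour deliberately to watch it in isolation. This is only a sketch: the exact wording and size in the error message depend on your system, and on machines with aggressive memory overcommit the oversized request may not fail right away:

```r
# Request an absurdly large vector, catch the allocation error, and confirm
# that memory in use falls back to roughly where it started.
before <- sum(gc()[, 2])
msg <- tryCatch(
  { x <- numeric(1e12); "allocation succeeded" },  # ~8 TB of doubles
  error = function(e) conditionMessage(e)
)
msg                      # typically "cannot allocate vector of size ... Gb"
sum(gc()[, 2]) - before  # close to zero: the failed attempt left nothing behind
```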
