I think I've seen that kind of thing with functions that do a lot of internal computation. While running, the function creates several intermediate objects in memory, until it runs out. So the last object, the one that fails to be allocated, may be only 242 MB, but before that R had already created 2.39 GB worth of other objects.
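A minimal sketch of that pattern (the function and sizes here are made up for illustration, not taken from any real code): each intermediate allocation succeeds, and only a later, comparatively small one pushes the session over the limit.

```r
## Hypothetical function: every name and size is invented for illustration.
heavy_function <- function(n = 1e8) {
  a <- rnorm(n)   # ~0.8 GB intermediate object
  b <- a * 2      # another ~0.8 GB copy
  d <- a + b      # a third ~0.8 GB object; ~2.4 GB live at once
  mean(d)         # tiny result, but peak usage was already ~2.4 GB
}

## On a machine with ~2.5 GB of free RAM, one of the later allocations
## fails with "cannot allocate vector of size 762.9 Mb", even though
## that single vector is much smaller than what was already allocated.
res <- try(heavy_function())
```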
If you have a way to monitor the free RAM (e.g. the Windows Task Manager set to a fast refresh speed), what I typically see is:
- I start the function
- for several seconds (or minutes) the computer works hard, the fans may spin up, and the used RAM increases steadily
- the function attempts the one-too-many allocation and stops with the error "cannot allocate vector of size xx Mb"; the RAM is immediately freed back to the level it was at when I started the function, and the fans stop blowing (the sketch after this list shows a way to confirm the same pattern from within R).
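If you'd rather confirm this from inside R than by watching the Task Manager, base R's `gc()` can report the session's peak memory use. A minimal sketch, reusing the hypothetical `heavy_function` from above:

```r
gc(reset = TRUE)               # reset the "max used" counters before the run
res <- try(heavy_function())   # try() keeps the session alive if it fails
gc()                           # the "max used" column now shows the peak
                               # reached during the call, even though the
                               # intermediates have already been freed
```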