RStudio taking up all the RAM and more for simple tasks

Hi,

I deleted and reinstalled the latest versions of R (3.4) and RStudio for Mac OS X from https://cran.r-project.org/bin/macosx/ and https://www.rstudio.com/products/rstudio/download/, respectively. I checked that R works properly in the Terminal by attaching a package. I then opened RStudio, cleared the environment using the broom icon, and typed print(3) in the console. The result never appeared. Instead, the memory used by RStudio (as shown in Activity Monitor) climbed and climbed, sometimes up to 40 GB (with 16 GB of physical memory). RStudio's CPU usage sat at 100%, my laptop's fans spun up loudly, and RStudio stopped responding; the only thing left to do was Force Quit. I have reproduced this problem dozens of times today and would love some help. I also noticed kernel_task using 100% of one CPU whenever RStudio was using 100% of another (my machine has 4 cores). After force quitting, Activity Monitor no longer shows high CPU or memory usage and the fans quiet down.

Thanks for the bug report! We'll need some more diagnostic information in order to get to the bottom of this. Could you try the following:

  1. Close any other instances of RStudio open on your machine,
  2. Launch a new instance of RStudio,
  3. Get RStudio into the state where it's using 100% of your CPU.

Then, you can open a terminal (e.g. press Cmd + Space to open Spotlight, type Terminal, and press Enter), then execute:

sample rsession > rsession-sample.txt

Then, you can send the generated rsession-sample.txt file over to us for analysis (e.g. share a link to the file via Dropbox, or upload it to https://gist.github.com).


Thanks Kevin. This time I did not even need to type anything in the console; RStudio went straight to swallowing my computer's resources and then stopped responding (shown in red in Activity Monitor). I put the file on my personal website at: http://cristian-riccio.ch/wp-content/uploads/2018/03/rsession-sample.txt

It looks like the rsession itself is not actually busy. Do you see the 'beach ball' when RStudio freezes up like this? Can you interact with other parts of the IDE even if the prompt is unresponsive (e.g. can you type in an open editor)?

Can you try running the same steps as before, but with profiling the RStudio executable instead?

sample RStudio > rstudio-sample.txt

You might also try and see if the advice in https://support.rstudio.com/hc/en-us/articles/200534577-Resetting-RStudio-Desktop-s-State helps you get out of this state.


The problem has magically disappeared...

I'm not sure whether to call that fortunate or not. :slight_smile:

In any case, if the problem returns please let us know!

@kevinushey Hi! I am facing the same issue. RStudio is consuming 6 GB out of 16 GB of RAM while fitting a linear regression. I uninstalled R 3.5 and installed 3.4.4 for Windows 10, but the same issue persists. I tried to open the terminal to generate the report you described above, but I am unable to navigate to other parts of the IDE while the function is executing. I have to manually kill the process from Task Manager to use RStudio again. I have uploaded the log file captured after I force-closed RStudio.

Link to the log file: https://www.dropbox.com/s/d5nkhhuq5oensjx/rsession-rohan.log?dl=0

It's hard to say without seeing your model / code / data, but is it possible that R itself is running out of memory while attempting to fit your model?
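One quick way to check is to inspect memory usage directly from the R console. Here is a rough sketch (the object name Train is just a placeholder; substitute whatever object holds your data):

# Size of the data itself (assumes 'Train' is already loaded):
print(object.size(Train), units = "MB")

# Current memory used by R's heap:
gc()

# On Windows only: the memory ceiling R is allowed to use:
memory.limit()

If object.size reports something small but gc() shows the heap ballooning during the model fit, the memory is being consumed by the computation itself rather than by the data.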

Below is the code:

suppressPackageStartupMessages(library(tidyverse))
suppressPackageStartupMessages(library(stats))  # loaded by default; kept for clarity
suppressPackageStartupMessages(library(mice))

# Read the training data
Train <- read.csv("fTrain.csv", header = TRUE)
str(Train)

# Coerce factor columns to numeric
Train$Gas.Oxygen      <- as.numeric(as.character(Train$Gas.Oxygen))
Train$Liquid.Nitrogen <- as.numeric(as.character(Train$Liquid.Nitrogen))
Train$Liquid.Oxygen   <- as.numeric(as.character(Train$Liquid.Oxygen))
Train$Liquid.Argon    <- as.numeric(as.character(Train$Liquid.Argon))
Train$Cooling.Water   <- as.numeric(as.character(Train$Cooling.Water))
Train$Power           <- as.numeric(as.character(Train$Power))

hist(Train$Power)

# Show the missing-data pattern, then impute
tail(md.pattern(Train), 1)
imp <- mice(data = Train)
imp

# Fit the linear model and draw added-variable plots
Model <- lm(Power ~ ., data = Train)
car::avPlots(Model)

The program stops at the mice function. The data set has 9000 observations and 7 variables. I am not sure if R is running out of memory. Every time I execute the code above, I see 6 GB of RAM being consumed.

I suspect the issue is indeed that mice is consuming all available memory, and is either freezing up or just taking a very long time to impute.
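One cheap experiment (a sketch; the parameter values below are just a starting point, not a recommendation): run mice with a single imputed dataset and a single iteration, which should finish quickly if the function can make progress on this data at all.

# Minimal mice run: one imputed dataset (m = 1), one iteration
# (maxit = 1), with progress printed to the console.
imp <- mice(data = Train, m = 1, maxit = 1, printFlag = TRUE)

If even this minimal run exhausts memory, the problem is in how mice handles this particular data rather than in RStudio.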

Do you see the same behavior if you try to fit this model within a plain R session run at the terminal, rather than within RStudio?
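For example, something along these lines should work (assuming you save the code above in a file — model.R here is just a placeholder name — in your working directory):

# Run the script non-interactively:
Rscript model.R

# Or start a plain interactive R session:
R
# then, at the R prompt:
# > source("model.R")

On Windows you may need R's bin directory on your PATH (or use the full path to Rscript.exe) for these commands to be found.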

I don't know how to run the model in the Terminal. I tried using online help for executing the model in the Terminal, but no luck. Could you please help me with what to do?

I had the same problem. What I did was restart my R session and it worked!