RStudio project crashing during ML model training

I'm running an R script in my RStudio Cloud project that trains an ML model with the caret package. In the past, training typically took under 7 hours and completed without issue.

Recently (within the last 24 hours as of this post), I've run the script twice, and each time the RStudio Cloud project has crashed, closed, or timed out. The model never gets saved to the object I expect; instead, after I refresh my browser, the project reloads with a random data frame from the script (one that shouldn't be loaded at all) and no trained model.

I'd love advice on anything I can do on my end to free up space or troubleshoot, and I'm happy to share details if there's something you can check on your end. FYI, I'm running the training script as I submit this. Thanks!

It's possible that you're running up against the 1 GB resource limit for Cloud projects, which includes all loaded libraries, data, and temporary files.


Ah, that's a good point and likely the case. Any idea what I can do on my end to minimize the resources used? Currently, I'm only loading the libraries and data needed to run the model, and I'm using rm() throughout the script to drop data frames. I also recently deleted files from the project workspace, although I'm not sure those count against the 1 GB limit. Happy to get advice before heading over to Google Cloud and spinning up an instance just to run the script once.
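For finding where the memory is going, a minimal base-R sketch (the `big_df` object is a hypothetical stand-in for a large training frame):

```r
big_df <- data.frame(x = runif(1e5))   # stand-in for a large training data frame

# List workspace objects by size to find the biggest consumers
obj_sizes <- sapply(ls(), function(nm) object.size(get(nm)))
print(sort(obj_sizes, decreasing = TRUE))

# Drop what you no longer need, then ask R to release memory
rm(big_df)
invisible(gc())
```

In caret itself, setting `trainControl(returnData = FALSE, trim = TRUE)` should keep the fitted `train` object from retaining a copy of the training data and trim the final model, which can shrink the object considerably.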


I can't offer any useful suggestions beyond running and deciphering

cat(readLines("/proc/meminfo"), sep = "\n")

which will give you somewhat cryptic information about current memory usage.
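To make that output less cryptic, a small sketch that pulls out the headline figures (Linux-only; `/proc/meminfo` reports values in kB):

```r
# Extract the total, free, and available memory lines and convert kB to MB
if (file.exists("/proc/meminfo")) {
  meminfo <- readLines("/proc/meminfo")
  keep <- grepl("^(MemTotal|MemFree|MemAvailable):", meminfo)
  kb <- as.numeric(gsub("[^0-9]", "", meminfo[keep]))
  print(setNames(round(kb / 1024), sub(":.*", "", meminfo[keep])))  # MB
}
```

Note that inside a container the limit that matters may be the cgroup quota rather than the host totals shown here, so treat these numbers as an upper bound.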

This topic was automatically closed 7 days after the last reply. New replies are no longer allowed.