Free Workspace Limitations?

I am starting to work with data sets of around 10k observations x 150 predictors, using methods like random forests or neural networks (keras). My environment abnormally terminates, crashes, or loses connectivity and forces me to log off and resume. This happens quite predictably after only a few iterations of these programs, and sometimes I can't even get a small batch of epochs through a simple image-recognition tutorial.

A few questions:

  • What techniques or practices help clean up memory and use the workspace sustainably?
  • What are the limitations of the free account, and where do you go next if you know you will want to move beyond them?

Apologies if this has already been asked and answered - I had a brief search through and couldn't find an answer.

Thank You

Adrian

Hi @Adrian68! Welcome!

Is this question about RStudio Cloud? (if so, could you please move it to the #RStudio-Cloud category? You can do this by clicking the little pencil icon that appears at the right side of the topic title when you're scrolled to the top of the page)

Hello,

If you can provide the URL to the project, I can try to confirm. From your description it sounds like you are hitting the memory limit (which is currently only 1GB). We are still in alpha and do not yet support higher memory tiers.
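In case it helps, here is a minimal base-R sketch (just gc(), object.size(), and rm(), nothing Cloud-specific) for checking how close a session is to that 1GB ceiling and freeing up large objects between model runs; the object names in the rm() call are only placeholders:

```r
# Report the session's current memory usage; the "used (Mb)" column is what
# counts against the workspace memory limit
gc()

# List objects in the global environment from largest to smallest (in bytes)
obj_sizes <- sapply(ls(), function(nm) object.size(get(nm, envir = .GlobalEnv)))
sort(obj_sizes, decreasing = TRUE)

# Drop large intermediates you no longer need (placeholder names below),
# then collect garbage so the memory is actually released
rm(big_training_matrix, old_model_fit)
gc()
```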

Another user was using an R package called ulimit to explore their memory footprint while using Cloud, so if you want to work within the limit I would suggest trying that.
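For reference, a rough sketch of how that might look. The ulimit package is not on CRAN; I'm assuming here that it is installed from GitHub (krlmlr/ulimit) and that memory_limit() takes the cap in MiB, so treat the install path and numbers as things to verify:

```r
# ulimit is assumed to come from GitHub rather than CRAN
# install.packages("remotes")
remotes::install_github("krlmlr/ulimit")

# Cap this session at roughly the Cloud limit (value in MiB) so that an
# over-sized allocation fails with an R error instead of killing the session
ulimit::memory_limit(1000)

# Example: this ~1.6 GB allocation should now fail with
# "cannot allocate vector of size ...", leaving the session alive, which
# makes it easier to see which step blows the memory budget
x <- matrix(0, nrow = 2e4, ncol = 1e4)
```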

Sean

This topic was automatically closed 21 days after the last reply. New replies are no longer allowed.