I am starting to work with data sets of around 10k observations x 150 predictors. When I run things like random forests or neural networks (keras), my environment abnormally terminates: it crashes, loses connectivity, and forces me to log off and resume. This happens quite predictably after only a few iterations, and sometimes I can't even get through a small batch of epochs on a simple image-recognition tutorial.
A few questions:
- What techniques / practices help clean up memory and make sustainable use of the workspace?
- What are the limitations of the free account, and where do you go next if you know you will want to go beyond them?
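To illustrate the first question, here is a minimal sketch (assuming a Python/keras workflow; object names are hypothetical) of the kind of between-run cleanup I mean: dropping references to large intermediates and triggering garbage collection. In a keras session one would also typically call `tf.keras.backend.clear_session()`, omitted here so the snippet stays self-contained.

```python
import gc

# Hypothetical stand-in for a large intermediate (e.g. a fitted model
# or preprocessed dataset) that is no longer needed between runs.
big_intermediate = [0.0] * 10_000_000

# Drop the reference so the memory becomes reclaimable...
del big_intermediate

# ...then explicitly ask the garbage collector to collect;
# it returns the number of unreachable objects it found.
freed = gc.collect()
print(freed >= 0)
```

Is this the right kind of approach, or are there environment-level practices I'm missing?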
Apologies if this has already been asked and answered; I had a brief search through but couldn't find an answer.
Thank you,
Adrian