I'm writing this on behalf of an analyst I work with, and we're having a problem with OOM (out-of-memory) occurrences. This is RStudio running on Linux, and she does her work through the browser. We first set her up on a VM with 16 GB of RAM, and some of her runs caused the system to hang, requiring a hard reboot. We then provisioned her a VM with 32 GB of RAM and it is still happening, and I suspect it would keep happening even if we provisioned one with 64 GB. I am completely unfamiliar with RStudio, so forgive me; I'm a systems engineer.
I asked her for some information, and she said she will need to work with data sets ranging from 10-15 GB in size.
From her words:
What I have been doing includes:
- data preprocessing, such as removing HTML tags, phone numbers, and email addresses
- chunking the data, i.e., splitting a large text into several smaller texts
- finally running the model, which uses a one-layer neural network to build word embeddings (machine learning)
It might be that whatever she is doing would OOM on any system; she may be doing something incorrectly, or in a bloated way that could be optimized.
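For what it's worth, the first two steps she lists can in principle be done in a streaming fashion, so peak memory stays flat regardless of file size. Here is a rough shell sketch of that idea (the file names and patterns are my own placeholders, not her actual workflow):

```shell
# Hypothetical streaming version of the preprocessing and chunking steps
# (file names and regexes are illustrative placeholders).

# Tiny stand-in corpus so the commands below can run as-is.
printf '<p>Call 555-0100 or mail bob@example.com</p>\n' > corpus.txt

# 1. Preprocess: strip HTML tags and e-mail addresses. sed processes the
#    file line by line, so peak memory stays small no matter how big it is.
sed -E -e 's/<[^>]+>//g' -e 's/[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+//g' \
    corpus.txt > clean.txt

# 2. Chunk: split the cleaned text into ~500 MB pieces for the model step.
split -b 500M clean.txt chunk_
```

If instead her R code reads the whole 10-15 GB file into memory at once and transforms it in place, that alone could explain the OOMs.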
My question is: is there a way I can set a hard RAM limit within RStudio, or on the underlying R process? The VM runs a 4.x Linux kernel, so setting an RSS limit via /etc/security/limits.conf or ulimit -m won't work; the kernel stopped enforcing that limit (RLIMIT_RSS) many versions ago.
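Two mechanisms I'm considering for a hard cap, sketched below. This is not tested advice: the 24G figure is a placeholder, and MemoryMax= assumes cgroup v2 with a reasonably recent systemd (on cgroup v1 the property is MemoryLimit= instead):

```shell
# Sketch of two ways to hard-cap the R session's memory (values are
# placeholders, not recommendations).

# 1. cgroup limit via a transient systemd scope. MemoryMax= needs cgroup v2;
#    on cgroup v1 / older systemd the property is MemoryLimit=.
systemd-run --scope -p MemoryMax=24G R

# 2. Address-space limit (RLIMIT_AS), which modern kernels still enforce,
#    unlike the RSS limit that ulimit -m used to set. The value is in KiB;
#    it applies to processes started from this shell.
ulimit -v 25165824   # 24 GiB
```

With either cap in place the run would fail with an allocation error instead of dragging the whole VM down, which at least keeps the host responsive.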
Thank you for any advice. If you think she might be attempting an operation that would fail regardless of RAM, I can have her sign up here and describe what she's doing.