I have a peculiar issue with an R script that runs some machine learning processes (specifically h2o, which uses Java behind the scenes). The script works perfectly on every physical machine I have ever run it on. The problem is that when I run it on the virtual machine given to me by my company (which has twice as much RAM as my most powerful desktop computer), the script runs out of memory.
At first the script failed on the virtual machine with an error saying the paging file was too small for the operation to complete. After I enlarged the paging file, the script runs, but Java consumes several times more RAM than it does on any other machine, and memory usage keeps growing until the process fails and crashes R. On any other machine the same script works fine.
I have tried turning the page file off, letting Windows manage it automatically, and every allowable page-file configuration, and nothing works. The virtual machine either refuses to run the script at all or consumes memory until it crashes.
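In case it is relevant: I know h2o lets you cap the heap of the Java backend when you initialize it, so a runaway JVM can at least be bounded. A minimal sketch of what that looks like (the "8g" value here is just an illustration, not my actual setting):

```r
library(h2o)

# Cap the JVM heap so the h2o backend cannot grow without bound.
# max_mem_size maps to the JVM's -Xmx flag; in practice it should be
# set comfortably below the machine's physical RAM.
# nthreads = -1 tells h2o to use all available cores.
h2o.init(max_mem_size = "8g", nthreads = -1)
```

Even with a cap like this in place, the virtual machine behaves differently from every physical machine, which is what I cannot explain.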
Does anyone have any idea what I could try to get this virtual machine to work? My IT department has been... less than helpful.
Thank you very much for your time.