I have a long script that needs a lot of packages to prepare data before eventually running in parallel (forked, on Linux). I've been running into memory issues across all of the parallel workers, and have improved things a lot by removing unnecessary objects, calling gc(), etc. However, all of the loaded packages get inherited into the memory of each worker, and the vast majority of them are only needed for the data-prep stage. Detaching packages doesn't actually seem to free up memory (in some cases it literally increases it??). I also tried calling .rs.restartR() in the script, but that stops the rest of the script from executing.
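Roughly, the current structure looks like this (a simplified sketch; the package names and the `prepare_data()` / `do_work()` helpers are placeholders for my actual code):

```r
library(parallel)
library(dplyr)   # plus many other packages used only during data prep
library(tidyr)

prepared <- prepare_data()              # heavy data-preparation step

rm(list = setdiff(ls(), "prepared"))    # drop intermediates before forking
gc()

## attempted cleanup -- detaching doesn't appear to release memory
detach("package:tidyr", unload = TRUE)
detach("package:dplyr", unload = TRUE)

results <- mclapply(seq_len(nrow(prepared)), function(i) {
  ## each forked worker inherits the whole parent session,
  ## including every package still loaded
  do_work(prepared[i, ])
}, mc.cores = 8)
```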
Is there any way to actually free the memory used by a package when it is detached? Or would it be better to switch to socket clusters and explicitly export only the objects, and load only the packages, that each worker needs (roughly as sketched below)?
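For context, the socket-cluster alternative I have in mind would look something like this (a sketch only; the cluster size, the package, and the `do_work()` helper are illustrative):

```r
library(parallel)

cl <- makeCluster(8, type = "PSOCK")   # fresh R sessions with no inherited state

## load only what the workers actually need
clusterEvalQ(cl, library(data.table))

## export only the objects required for the parallel step
clusterExport(cl, varlist = c("prepared", "do_work"))

results <- parLapply(cl, seq_len(nrow(prepared)), function(i) {
  do_work(prepared[i, ])
})

stopCluster(cl)
```

The trade-off I'm unsure about is that `clusterExport()` copies `prepared` into every worker's memory, whereas forking shares it copy-on-write.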