pandoc consumes a large amount of memory (>20 GB) in RStudio

This issue is similar to a previous topic that received no answer. I have the same problem on a Mac laptop (a 10-year-old model that still uses an Intel chip).

I found that there is a memory leak in pandoc when using R in RStudio. It happens when I click to evaluate a chunk. I have experienced this issue since last month and have no clue why. I tried upgrading both R and RStudio, but that did not solve the problem.

I also found that loading the tidyverse library costs ~4 GB of RAM. I therefore force-reinstalled the library, but that did not help either.

Is this a known bug, and if so, what are the mitigation methods?

Another thought: I recently edited my .Renviron and added the line R_MAX_VSIZE=16Gb, but I am not sure whether that is the cause. I really need that line to be able to run some large matrix multiplications. So, if that is the root of the problem, is there a way to safely allow R to use all (or nearly all) of the physically installed memory?
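For what it's worth, here is a sketch of how one might derive that .Renviron line from the machine's actual RAM rather than hard-coding it; the byte count below is a hypothetical example (on macOS it would come from the real command `sysctl -n hw.memsize`):

```shell
# Hypothetical example: suppose the machine reports 17179869184 bytes of RAM
# (on macOS, this figure would come from: sysctl -n hw.memsize)
bytes=17179869184

# Convert bytes to whole gigabytes
gb=$(( bytes / 1024 / 1024 / 1024 ))

# Print the line you would append to ~/.Renviron
echo "R_MAX_VSIZE=${gb}Gb"
```

This prints `R_MAX_VSIZE=16Gb` for the example value; substituting the machine's real byte count would size the limit to the installed memory instead of a guessed figure.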

Thanks!

See

man ulimit

which defaults to 8 GB (2^23 KB, since ulimit reports its values in kilobytes). This limits how much memory the OS will grant to a process.
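A quick way to check this, assuming a POSIX-like shell, is to query the virtual-memory limit directly and verify the kilobyte arithmetic:

```shell
# Show the current per-process virtual-memory limit.
# Values are reported in kilobytes; "unlimited" means no cap is set.
ulimit -v

# Sanity-check the conversion: 2^23 KB = 8388608 KB = 8 GB
echo $(( 8388608 / 1024 / 1024 ))
```

The second command prints 8, confirming that a 2^23-kilobyte limit corresponds to the 8 GB figure above.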

This topic was automatically closed 21 days after the last reply. New replies are no longer allowed.

If you have a query related to it or one of the replies, start a new topic and refer back with a link.