Shiny memory usage

In my company we have developed a shiny application that I would call medium-sized (12k LOC in about 15 files), although I honestly have no idea what the usual size of a Shiny app is :slight_smile:

I was surprised to find out that each instance of the application takes about 600 MB of resident memory (Amazon Linux), even when just loaded and with no user interaction, and even more surprised that of this only 10% is shared memory.
We do need a lot of libraries, but this seems excessive.
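To see how much of that 600 MB is really private to each instance, the per-mapping breakdown in /proc can help. A minimal sketch (Linux-only; the PID argument is an assumption, pass one of your Shiny workers' PIDs):

```shell
# Break down resident memory for one R/Shiny process (Linux-only).
# Shared_* pages (e.g. mapped shared libraries) are counted in every
# process's RSS but occupy physical memory only once.
PID=${1:-$$}   # pass a Shiny worker's PID; defaults to this shell for demo
awk '/^(Rss|Shared_Clean|Shared_Dirty|Private_Clean|Private_Dirty):/ {
       sum[$1] += $2
     }
     END { for (k in sum) printf "%-16s %8d kB\n", k, sum[k] }' \
    "/proc/$PID/smaps"
```

If Private_Dirty dominates, the memory is genuinely per-instance and won't be shared between workers no matter how many you run.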

Can you recommend a good way to hunt down what parts of the application are so memory-hungry?
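One crude but effective way to find the heavy dependencies is to load them one at a time in a fresh R session and watch the process RSS grow. A sketch (Linux-only, reads /proc; the package names passed in are examples):

```r
# Sketch: measure the resident-set growth caused by each library() call.
# Linux-only: reads VmRSS from /proc/self/status.
rss_kb <- function() {
  status <- readLines("/proc/self/status")
  line <- grep("^VmRSS:", status, value = TRUE)
  as.numeric(gsub("[^0-9]", "", line))
}

measure_load <- function(pkgs) {
  before <- rss_kb()
  for (pkg in pkgs) {
    library(pkg, character.only = TRUE)
    after <- rss_kb()
    message(sprintf("%-12s +%6.1f MB", pkg, (after - before) / 1024))
    before <- after
  }
}

# Run in a fresh R session, e.g.:
# measure_load(c("shiny", "tmap"))
```

Load order matters: a package is only "charged" for dependencies that are not already loaded, so run this from a clean session each time.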

I have an even worse problem: my Shiny apps are not only big, but also keep growing by themselves over time. I'd also like to know a good way to profile a Shiny app and understand what's going on with it.
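For tracking growth over time, one minimal approach (base R only, using gc() heap accounting; a sketch, not an established recipe from this thread) is to log a heap snapshot periodically:

```r
# Sketch: a one-line heap snapshot to log from a timer inside the app,
# to see whether memory grows per session, per hour, or per action.
log_heap <- function(tag = "") {
  mb <- sum(gc(verbose = FALSE)[, 2])  # MB used by R objects (Ncells + Vcells)
  message(sprintf("[%s] %s R heap: %.1f MB", format(Sys.time()), tag, mb))
  invisible(mb)
}
```

Inside a Shiny server this could be called from something like `observe({ invalidateLater(60000); log_heap() })`. Note it only tracks memory held by R objects; a leak in native code (a C library, the allocator) would grow the process RSS without showing up here.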

Have you taken a look at the resources in the Performance and Tuning sections of the Shiny site?

https://shiny.rstudio.com/articles/#performance

Thanks, I didn't remember that profvis also showed memory usage.

However, I get very different results when doing the exact same profiling twice. It seems that profvis reports the memory usage of a library only the first time it is loaded, while on subsequent runs it is skipped.
Do you know if that's actually the case?

And even on the first run the numbers are much lower than what ps reports. I'm not sure how to interpret these memory figures from profvis...

OK, lessons learned:

  • The R interpreter itself is pretty big: about 60 MB baseline (compare with 4.6 MB for Python 3.6)
  • Shiny throws in another 10 MB just loading the library
  • But by far the culprit seems to be tmap, which is a monster at a whopping 300 MB :neutral_face: I'll try loading and unloading it on demand.
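For loading on demand, one option is to skip `library(tmap)` at start-up and resolve the package only when the feature is first used. A minimal, generic sketch (the `with_pkg` helper and the commented render call are illustrations, not tmap API beyond `tm_shape`/`tm_polygons`):

```r
# Sketch: load a heavy package only when its feature is first used.
with_pkg <- function(pkg, expr) {
  if (!requireNamespace(pkg, quietly = TRUE)) {
    stop(sprintf("Package '%s' is required for this feature", pkg))
  }
  force(expr)   # expr is a promise: only evaluated if the package loads
}

# In the app, instead of library(tmap) at the top, something like:
# output$map <- renderPlot(with_pkg("tmap",
#   tmap::tm_shape(world) + tmap::tm_polygons()))
```

One caveat on the unloading half of the plan: `unloadNamespace("tmap")` frees R-level objects, but DLLs loaded by the package and its dependencies typically stay mapped, so the RSS may not drop back much; deferring the load is usually the bigger win.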

Thanks for your help!
