Hi @Jayrus, I am wondering where/when you are loading the libraries?
If you load them in the right scope, the libraries should only be loaded at startup of the app (and not at each user session), and I think that should be fine?
As previously mentioned, you should "load" your libraries only once per session. The easiest way to achieve that is to put all of your library(pkgname) calls in a file named global.R, which Shiny sources once at app startup by convention.
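A minimal sketch of what that file can look like (the package names are placeholders; substitute whatever your app actually uses):

```r
# global.R -- by convention, Shiny sources this file once when the app
# starts, before ui.R/server.R, so library() calls here run once per
# R process rather than once per user session.
# (Package names below are placeholders; substitute your app's packages.)
library(tools)
library(parallel)

# Objects created here are shared across all sessions of this process.
app_start_time <- Sys.time()
```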
Another source of issues I saw once was repeatedly installing a library each time before loading it.
All in all, you should install once, and load only once, at app startup.
Another option to consider is not loading libraries at all when you only use a few of their functions, and referencing those functions with the :: operator instead.
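For example (using tools::file_ext purely as an illustration), a single function can be referenced through its package's namespace without any library() call:

```r
# No library(tools) needed: the :: operator resolves the function
# directly in the tools namespace, loading that namespace on first use.
ext <- tools::file_ext("report.csv")
ext  # "csv"
```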
What exactly do you mean by "loading library"? Do you mean calls to library(foofy)?
How have you established that the bottleneck is in library loading? (i.e., what evidence are you using to determine that this is the slowest part of your app?)
(@andresrcs using :: instead of library() is unlikely to save much time, because the first time you use :: with a package, it still has to load the namespace, and that's most of the time taken by library(). @jm_t second and subsequent calls to library(foofy) do relatively little work, so while they're unnecessary, they're unlikely to be the cause of a significant slowdown.)
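This is easy to see with loadedNamespaces(): the first :: call loads the package's namespace, which is where most of library()'s time goes. A small sketch (MASS is used here only because it ships with R):

```r
# Ensure MASS is not loaded yet (skipped if something else already loaded it).
if ("MASS" %in% loadedNamespaces()) unloadNamespace("MASS")

before <- "MASS" %in% loadedNamespaces()           # FALSE in a fresh session
x <- MASS::mvrnorm(1, mu = 0, Sigma = matrix(1))   # first :: use loads the namespace
after <- "MASS" %in% loadedNamespaces()            # TRUE: :: paid the load cost
```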
I can't disagree with this, but in my experience the time reduction is at least noticeable.
I had recommended this approach because I saw an improvement of 5 seconds on a scheduled R script that I use for data collection, just by replacing library() with ::, and in my use case that small gain was meaningful, although I recognize I can't generalize solely from this experience.
EDIT: Now that I think about it, maybe the difference I saw was due to the fact that library() calls usually go at the beginning of the script, whereas with :: the namespace is loaded at the moment that line of code gets executed. I was naively judging the time reduction from timestamps recorded in my database, so I was fooling myself.
Sorry for the misleading comment.
The reason I think the library(foofy) calls are the problem is empirical evidence. Once I start my Docker container, my Shiny global.R file or Rprofile setting (both are exactly the same) calls almost 20 libraries and takes almost 2000 ms (2 s). Once the container is running, this bottleneck is no longer a problem. But because I use a Docker container per concurrent user, this cold start time (almost 8 s... lol) is a problem.
I observed all of this activity with the Google Chrome developer tools.
I would try measuring against the default Shiny app as a benchmark. If your cold startup is still in the same range, then the number of libraries is not the issue.
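One quick way to get evidence either way is to time each library() call itself; the package names below are placeholders for whatever your app actually loads:

```r
# Time each library() call to see which packages dominate startup.
pkgs <- c("grid", "parallel", "tools")  # substitute your app's ~20 packages
load_times <- sapply(pkgs, function(p) {
  system.time(library(p, character.only = TRUE))[["elapsed"]]
})
sort(load_times, decreasing = TRUE)  # slowest packages first
```

Running this in a fresh R session (loaded namespaces are cached, so a warm session will under-report) tells you whether library loading really accounts for the ~2 s you measured.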
What R profiling tools did you use to determine where the bottleneck is in your R code? It sounds like you are using the Chrome developer tools, which are not going to give you much insight into your R code.
When I developed the Shiny app, it was a problem too. The cold start time of the default Shiny app is almost the same as my Shiny app in a Docker container, so I think the reason is library loading time (this could be a wrong guess).
I used the shinyloadtest tool to test the response times of the Shiny app's components, and figured out that the components inside the app are not the problem.
I also load- and stress-tested the Shiny app with JMeter and the Chrome developer tools. The cold start time was almost 2 seconds, while the Docker container itself starts really fast. I can also observe R's behavior through an interactive command line in the Docker container. From these points, I concluded that library loading time is the problem.