Can putting functions into a package speed up Shiny apps?

I have ~1.5k lines of code defining functions and algorithms necessary to operate on user data that is input into my Shiny app.

These lines of code seem to contribute to the app's loading time simply because the functions need to be read in when the app loads (in addition to my already hefty Shiny code).

If I switch from a .R file containing these functions - called in the Shiny code with source("myfunctions.R") - to putting them all inside a package loaded with library(mypackage), does that speed things up?
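For concreteness, this is roughly the change I'm considering (myfun and mypackage are just placeholder names):

library(shiny)

# current approach: every line of myfunctions.R is evaluated at startup
source("myfunctions.R")
# proposed approach: attach a package containing the same functions instead
# library(mypackage)

ui <- fluidPage(
    numericInput("x", "x", value = 1),
    textOutput("result")
)
server <- function(input, output, session) {
    # myfun() stands in for one of the ~1.5k lines' worth of functions
    output$result <- renderText(myfun(input$x))
}
shinyApp(ui, server)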

A few points on this:

Will putting your functions into a package speed up your code? Technically speaking, yes. When you call source, you are telling R "run 1,500 lines of code that define my functions". When you use library, you are telling R "there is now a new namespace mypackage, which contains my functions." Put another way - using source is an "eager" way of defining your functions, whereas using library is a "lazy" way of making them available. Consider the following example:

myfun <- function(x) {x + 1}
# You could put a bunch of code below this that never calls myfun
print(1:10)

No matter what you put below myfun, R will always run the code that defines myfun. Whereas, if you put myfun into a package, then unless you specifically tell your package to run code in .onLoad() or .onAttach(), attaching it just exposes the names of your functions on the search path; the function bodies themselves are lazy-loaded the first time they are used.
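Here is a minimal sketch of that difference from the app's point of view (mypackage and myfun are hypothetical stand-ins for your own names):

# source() evaluates every definition immediately at startup:
source("myfunctions.R")   # runs all ~1,500 lines right away

# library() only attaches the package's exported names to the search path;
# the function objects sit in the package's lazy-load database and are
# loaded the first time they are used:
library(mypackage)        # hypothetical package wrapping the same functions
exists("myfun")           # TRUE - the name is visible
myfun(1)                  # body is fetched from the lazy-load DB on first use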
Also, putting your code into a package tends to be very good practice for reproducibility, so it is probably worth doing for that reason, if nothing else.
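If you do go that route, the usual workflow looks roughly like this (using usethis/devtools; the path, package name, and GitHub repo below are placeholders):

# one-time setup: scaffold a package and move your function definitions into R/
usethis::create_package("~/path/to/mypackage")   # placeholder path and name
# copy myfunctions.R into mypackage/R/, then:
devtools::document()    # build NAMESPACE and help files from roxygen comments
devtools::install()     # install locally so the app can call library(mypackage)

# if you later share it on GitHub rather than CRAN, it can be installed with:
# remotes::install_github("youruser/mypackage")  # placeholder repo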

However, unless you are defining functions in a non-standard way, I would be very surprised if defining the functions is the principal cause of your app's loading time issues. How sure are you that the functions themselves are the cause of the slowdown?

For example:

# time only the cost of *defining* a function here - it is never called
microbenchmark::microbenchmark({
    x <- function(y) {
        y + 1
    }
}, times = 1000)
#> Unit: nanoseconds
#>                                            expr min  lq  mean median  uq  max
#>  {     x <- function(y) {         y + 1     } } 100 100 156.3    100 200 7200
#>  neval
#>   1000

Created on 2022-10-31 by the reprex package (v1.0.0)
Extrapolate this out to 1,500 lines of code and you are still looking at well under a millisecond...
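If you want to confirm where the startup time actually goes, you could time the source() call directly and profile the whole launch (the file name is from your post; the app path is a placeholder):

# rough check: how long does evaluating the function definitions really take?
system.time(source("myfunctions.R"))

# for a fuller picture of startup, profile the whole app launch:
# profvis::profvis(shiny::runApp("path/to/your/app"))   # placeholder path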

Fantastic explanation, thank you @dvetsch75! That's just what I wanted to know, and explains why you can use a function from a huge package (the tidyverse, for example) without having to read in/define each of those functions beforehand.

How sure are you that the functions themselves are the cause of the slowdown?

I'm certain it's a cause, but it's definitely a compound problem. I hesitate to say "problem" because the app isn't terribly laggy; it's just that the loading time is longer than I'd like.

The reason I know the functions.R file is a cause of the loading delay is that the app starts much more quickly in a test where I leave that file out.

Thanks a lot, I will look into putting the functions into a package (and at putting it on GitHub rather than CRAN).

