From Joe Cheng's rstudio::conf(2019) keynote...
- Optimize
- Move work outside of Shiny (very often)
- Make code faster (very often)
- Use caching (sometimes)
- Use async (occasionally)
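The "use caching" point can be sketched with shiny's bindCache() (available since shiny 1.6). Everything here is illustrative: the input id, the slow helper, and the app layout are assumptions, not part of the original answer.

```r
library(shiny)

# Placeholder for some slow computation keyed on a FIPS code (assumption)
expensive_summary <- function(code) {
  Sys.sleep(2)                       # simulate slow work
  data.frame(FIPS = code, n = 42)
}

ui <- fluidPage(
  selectInput("fips", "County", choices = c("FIPS1", "FIPS2")),
  tableOutput("summary")
)

server <- function(input, output) {
  # Cache the rendered result keyed on input$fips: repeated requests with the
  # same selection reuse the stored value instead of recomputing.
  output$summary <- renderTable({
    expensive_summary(input$fips)
  }) |> bindCache(input$fips)
}

# shinyApp(ui, server)
```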
Efficiency:
- Constant data-wrangling procedures should be moved outside of the Shiny application. (Perform them once, not n times.) Save the final result to an RDS file and load it outside of your server function, so it is read once at application start rather than at every user session.
Since everything is keyed by a FIPS code, I would build a named list with the FIPS codes as keys and, as each value, a list holding the subsetted sf data and the subsetted data table. The whole object can be stored in a single .rds file (see ?saveRDS / ?readRDS) and read outside the server function.
Given a user's selection for the FIPS code, pull the corresponding value from the list.
Example ideas:
# some_script.R -- run once, outside the app
all_data <- list()
all_data[["FIPS1"]] <- list(
  spatial = subsetted_spatial,                  # sf object already subset to FIPS1
  dt      = subset(aspatial, FIPS == "FIPS1")   # matching rows of the aspatial table
)
# ... repeat for the remaining FIPS codes ...
saveRDS(all_data, "all_data.rds")
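Rather than writing one assignment per FIPS code, the list can be built in a single pass. This is a minimal sketch with a stand-in data frame; the column name FIPS and the toy values are assumptions, and the sf subsetting line is commented out since it follows the same pattern.

```r
# Stand-in for the real aspatial table (assumption, for illustration only)
aspatial <- data.frame(
  FIPS  = c("FIPS1", "FIPS1", "FIPS2"),
  value = c(1, 2, 3)
)

# One list element per FIPS code, each holding that code's subsets
fips_codes <- unique(aspatial$FIPS)
all_data <- lapply(fips_codes, function(code) {
  list(
    # spatial = spatial_data[spatial_data$FIPS == code, ],  # sf subset, same idea
    dt = aspatial[aspatial$FIPS == code, ]
  )
})
names(all_data) <- fips_codes

saveRDS(all_data, "all_data.rds")
```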
# app.R
library(shiny)
# ...
all_data <- readRDS("all_data.rds")   # read once at app start, shared by all sessions

server <- function(input, output) {
  # use all_data[[...]] according to the user's FIPS selection
}
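The lookup inside the server might look like this. The input id fips is an assumption (it would come from a selectInput or similar in your UI), and wrapping the lookup in a reactive() is a design choice so both the spatial and aspatial pieces update together.

```r
server <- function(input, output) {
  # Reactive lookup: re-evaluates only when the user changes the FIPS selection
  selected <- reactive({
    req(input$fips)          # wait until a selection exists
    all_data[[input$fips]]
  })
  # selected()$spatial and selected()$dt are then ready for maps/tables
}
```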