I have an app that loads a large amount of data using a function I wrote, load_all_data(). Because of its size, the data is stored in the global environment so it is shared across users rather than reloaded per session. I need load_all_data() to run once a week, and no more often than that, to refresh the data.
I have drafted the script below based on a similar (but different) question on the forum. I just want to check that I'm not doing anything foolish, or approaching this the wrong way. Thanks for any thoughts.
library(shiny)

# Runs once per R process; the data frame and its timestamp live in the
# global environment and are shared by all sessions
data <- load_all_data()
date__data_updated <- Sys.Date()
ui <- fluidPage(
  # Nothing required in the UI
)
server <- function(input, output, session) {
  # Poll hourly; the check itself is just a date comparison, so it's cheap.
  # (My first draft assigned the reactivePoll results back onto the globals
  # with <<-, which replaced the data frame and the Date with reactive
  # expressions and broke the next check, so I update the globals inside
  # valueFunc instead.)
  refreshed_data <- reactivePoll(60 * 60 * 1000, session,
    # Return a new value (today's date) once the data is at least 7 days
    # old; a changed check value is what triggers valueFunc
    checkFunc = function() {
      if (Sys.Date() - date__data_updated >= 7) Sys.Date() else ""
    },
    # Refresh the data and its timestamp in the global environment so every
    # session sees the new copy. valueFunc also runs once when the poll is
    # first used, so guard the reload to avoid re-downloading at startup.
    valueFunc = function() {
      if (Sys.Date() - date__data_updated >= 7) {
        data <<- load_all_data()
        date__data_updated <<- Sys.Date()
      }
      data
    }
  )

  # reactivePoll only polls while something depends on its result, so keep
  # a dependency alive even though this app has no outputs
  observe(refreshed_data())
}
shinyApp(ui, server)
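In case it's useful for comparison, I also sketched a variant using the later package, which schedules the refresh outside the reactive system entirely, so there's no dummy observer and no per-session poll. refresh_weekly() and the 24-hour check interval are just my own names and choices, not anything official, so treat this as a sketch rather than a tested solution:

```r
library(later)

data <- load_all_data()
date__data_updated <- Sys.Date()

# Re-arms itself once a day; reloads only when the data is a week old
refresh_weekly <- function() {
  if (Sys.Date() - date__data_updated >= 7) {
    data <<- load_all_data()
    date__data_updated <<- Sys.Date()
  }
  later::later(refresh_weekly, 24 * 60 * 60)  # delay is in seconds
}
refresh_weekly()
```

Since this sits at the top of app.R it runs once per R process rather than once per session, which I think avoids duplicate refreshes when several users are connected, but I'd welcome correction if later timers don't fire reliably inside Shiny's event loop.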