Using cron to import data daily and update Shiny App

I've built a hockey animation app and deployed it on the free tier of shinyapps.io. Here is the link: NHL Play-by-Play App

(I've also published the raw dataset, a couple of images, and the R script that contains the Shiny code.)

Being a macOS user who doesn't know much about scheduling tasks, I'm trying to use crontab to run an R script once a day and load in a .Rda file of a raw hockey dataset.

I understand how to use crontab, but that's all I know at this point.

I am having trouble conceptualizing my task, which is to schedule an R script to load in the .Rda file, somehow upload that dataset to shinyapps.io, and then update my app with the fresh dataset.

If you could give me some advice on how I'd approach this task, I'd really appreciate that!

Hi Jason,

You can't upload individual data files to shinyapps.io without redeploying the whole app, so the best thing to do is have the app read its data from a remote data source, e.g. Google Drive, Dropbox, or a remote database. You can then have a scheduled cron job update the file your app reads in.
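As a minimal sketch of that pattern (the URL and object name here are hypothetical, not from your app), the Shiny app can download and load the .Rda file each time it launches, so the deployed app never needs to change when the data does:

```r
# Sketch: fetch a remote .Rda at app startup (URL is a placeholder).
load_remote_data <- function() {
  tmp <- tempfile(fileext = ".Rda")
  download.file("https://example.com/pbp_data.Rda", tmp, mode = "wb")
  e <- new.env()
  load(tmp, envir = e)  # loads whatever objects were saved into the .Rda
  e                     # return the environment holding the data
}

# At the top of app.R / server.R:
# dat <- load_remote_data()$pbp   # assumes the .Rda contained an object `pbp`
```

Loading into a fresh environment (rather than the global one) makes it explicit which objects came from the file.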

This article on persistent data storage should help with how to read data from external sources into your Shiny app:

And this is an example of an app that updates a Google Sheet with hourly cryptocurrency prices, which is then read into an R Markdown Shiny app hosted on shinyapps.io.


Hi Paul,

So, the data I'm importing is a .Rda file that lives on a server. I guess I could schedule a cron job to update my app's data with the new .Rda file. However, I'm not sure how scheduling a cron job would work. Do I write a new .R script?


Yeah, you would create an R script that has the code required to update your data file, then schedule a cron job to run the script at your chosen time interval on the server.
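If you'd rather not use an R package for scheduling, the equivalent plain crontab entry looks like this (the script and log paths are placeholders; this assumes `Rscript` is on the cron environment's PATH):

```shell
# Run the update script every day at midnight, logging output for debugging.
0 0 * * * Rscript /home/jason/update_data.R >> /home/jason/update_data.log 2>&1
```

You'd add this with `crontab -e`; redirecting stdout and stderr to a log file makes failed runs much easier to diagnose.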

You can use the cronR package to do this all in R. To set up the job, the code would look something like this:


library(cronR)

r <- cron_rscript("update_data.R")

cron_add(r, frequency = "daily", at = "00:00", description = "update data")

You can then use cron_ls() to check that it has registered the job.

To get this new data into your app on shinyapps.io, you would need to have something in your update_data.R script that uploads the updated file to a remote data storage space that you can access and load into your app on launch. Check out the persistent data storage article I linked to previously for examples of how to do this.
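A sketch of what update_data.R might contain, using the googledrive package for the upload (all file names, the Drive path, and the object name `pbp` are hypothetical, and non-interactive auth assumes you've cached a token beforehand):

```r
# Sketch: refresh the local .Rda, then push it to Google Drive
# where the Shiny app can download it at launch.
library(googledrive)

load("raw_hockey.Rda")            # assumed to load a data frame `pbp`
# ... any cleaning/processing of `pbp` goes here ...
save(pbp, file = "pbp_clean.Rda")

drive_auth()                      # cron runs need a pre-cached token
drive_upload(
  media     = "pbp_clean.Rda",
  path      = "shiny-data/pbp_clean.Rda",
  overwrite = TRUE                # replace the previous day's file
)
```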

I would recommend the googlesheets package if each update only adds a few rows, or googledrive if there is a lot of new data each update (or an entirely new dataset), because adding data to a Google Sheet row by row is slow. If you're familiar with setting up remote databases, you could try that option too.

However, if you host the app itself on the same server where your data lives and is updated by cron, using Shiny Server Open Source or Pro, it makes things a bit simpler: the app can simply point to your data file on the local filesystem (use reactiveFileReader if the file updates regularly).
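For that same-server setup, reactiveFileReader polls the file's modification time and only re-reads it when it changes; a sketch (the path and the object name `pbp` inside the .Rda are hypothetical):

```r
# Sketch: re-read a local .Rda inside the server function whenever
# the file on disk changes.
library(shiny)

server <- function(input, output, session) {
  pbp <- reactiveFileReader(
    intervalMillis = 60 * 1000,           # check for changes once a minute
    session        = session,
    filePath       = "/srv/data/pbp_clean.Rda",
    readFunc       = function(path) {
      e <- new.env()
      load(path, envir = e)
      e$pbp                               # assumes the .Rda contains `pbp`
    }
  )
  output$n_rows <- renderText(nrow(pbp()))  # downstream reactives see updates
}
```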

Hope that helps!


Hi Paul,

Thank you for this lengthy, very helpful response!!!

Also, I can understand your answer much more easily by looking at your crypto_tracker repo. In coin_scrape.R, it seems like you upload the updated file to a remote data storage space (Google Sheets) that you can then access and load into your app?

Yeah, in that case the coin_scrape.R script is executed every hour on a cloud server using cron. It only pulls 2 rows of data, so I opted to use the googlesheets package to append the new rows to an existing sheet I set up.

Then the Shiny app on shinyapps.io pulls in the data from the Google Sheet every time it is launched, so it is always using up-to-date data.


This topic was automatically closed 7 days after the last reply. New replies are no longer allowed.

If you have a query related to it or one of the replies, start a new topic and refer back with a link.