Hi, we have a set of Shiny apps for internal clients that share a common codebase and are differentiated only by config files. The general workflow is (a rough sketch of one run follows the list):
- Retrieve some data from a Redshift database
- Execute some R code that creates RData files with the necessary objects for the apps
- Publish the folder (app.R, RData, config file) to shinyapps.io
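To make the setup concrete, here is a minimal sketch of what one scheduled run looks like. It assumes DBI/RPostgres for the Redshift connection (Redshift speaks the Postgres wire protocol) and rsconnect for publishing; the host, query, folder and app names are placeholders, not our actual values.

```r
library(DBI)
library(RPostgres)
library(rsconnect)

# 1. Retrieve data from Redshift (credentials kept in environment variables)
con <- dbConnect(
  RPostgres::Postgres(),
  host     = Sys.getenv("REDSHIFT_HOST"),
  port     = 5439,
  dbname   = Sys.getenv("REDSHIFT_DB"),
  user     = Sys.getenv("REDSHIFT_USER"),
  password = Sys.getenv("REDSHIFT_PASSWORD")
)
sales <- dbGetQuery(con, "SELECT * FROM analytics.sales_summary")  # placeholder query
dbDisconnect(con)

# 2. Build the objects the app needs and save them as an RData file
#    next to the shared app.R and the per-client config file
save(sales, file = "app_folder/data.RData")

# 3. Publish the folder (app.R, RData, config) to shinyapps.io
rsconnect::deployApp(
  appDir      = "app_folder",
  appName     = "client-dashboard",  # differs per client config
  account     = "our-account",
  forceUpdate = TRUE
)
```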
These steps are currently run on a schedule on an AWS EC2 machine. I suspect there is a more scalable way to do this; autoscaling of the published app is also a consideration. Aside from shinyapps.io, I am aware that we could host the app on EC2 ourselves, likely with Shiny Server. Are there any best practices people follow here? This is essentially the same pattern as maintaining a BI workflow (e.g. Tableau): schedule a data extract that the dashboard then reads from. Thanks in advance!