Well, of course, it depends. How big is the database, how often does it change, and how often are reports needed?
Assuming that this is a periodic rather than a real-time project (if it's truly real time, it's time to hire a data engineer): bring your query results into a tibble (maybe you can cron this during the day), process them, format the reports with R Markdown, and deploy to Shiny.
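A minimal sketch of that pipeline. An in-memory SQLite table stands in for your actual database, and `report.Rmd` is a hypothetical report file, so adjust both for your setup:

```r
library(DBI)
library(RSQLite)
library(tibble)

# In-memory SQLite stands in for your production database (an assumption)
con <- dbConnect(RSQLite::SQLite(), ":memory:")
dbWriteTable(con, "sales",
             data.frame(region = c("east", "west"), amount = c(100, 250)))

# The cron-able step: pull the query result into a tibble
sales <- as_tibble(dbGetQuery(con, "SELECT region, amount FROM sales"))
dbDisconnect(con)

# Save a snapshot for the report; report.Rmd (hypothetical) would readRDS() it
saveRDS(sales, "sales.rds")
# rmarkdown::render("report.Rmd", output_file = "report.html")
```

The point of the `saveRDS()` step is that the report knits from a fixed snapshot, so a slow or flaky database never blocks rendering.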
If your volume is moderately high, consider RStudio Server (disclosure: I get nothing from them except the same great open source software everyone else does).
Finally, although I'm far from an expert, move your database to the cloud and all you have to worry about is network latency. The data from Redshift, say, will come in faster than you can swallow it.
In my view, the point of using R is to limit the database's job to storing the data: bring as much as you need into R and do the work there.
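One way to sketch that division of labor is with dbplyr, which keeps the query lazy on the database side until `collect()` pulls the rows into R. Again the in-memory SQLite table and the `orders` data are made up for illustration:

```r
library(DBI)
library(RSQLite)
library(dplyr)
library(dbplyr)

con <- dbConnect(RSQLite::SQLite(), ":memory:")
dbWriteTable(con, "orders",
             data.frame(status = c("open", "closed", "open"),
                        total  = c(10, 20, 30)))

# The database only filters; collect() brings the rows into R,
# where the rest of the analysis lives
open_orders <- tbl(con, "orders") |>
  filter(status == "open") |>
  collect()

summarise(open_orders, revenue = sum(total))
dbDisconnect(con)
```

Everything before `collect()` is translated to SQL and run in the database; everything after is ordinary in-memory dplyr.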