General advice: R Shiny app to perform analysis on client data

Hello,

I'm trying to write an R Shiny web app that:

  1. Accepts client data (including Excel files, .png images and .h5ad files) (see the sketch after this list)
  2. Transfers the data to a local HPC cluster and stores it there
  3. Generates an analysis report, which must then be emailed back to the client.
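For context, here is a minimal sketch of what I currently have in mind for step 1; the accepted file types, the staging folder and the notification text are just placeholders:

```r
library(shiny)

ui <- fluidPage(
  titlePanel("Client data upload"),
  fileInput(
    "client_files", "Upload your data",
    multiple = TRUE,
    accept = c(".xlsx", ".xls", ".png", ".h5ad")
  ),
  textInput("client_email", "Email address for the report"),
  actionButton("submit", "Submit")
)

server <- function(input, output, session) {
  observeEvent(input$submit, {
    req(input$client_files, input$client_email)
    # Copy the uploads from Shiny's temp dir to a per-submission staging
    # folder (placeholder path) so another process can pick them up later
    staging <- file.path("staging", format(Sys.time(), "%Y%m%d_%H%M%S"))
    dir.create(staging, recursive = TRUE)
    file.copy(input$client_files$datapath,
              file.path(staging, input$client_files$name))
    writeLines(input$client_email, file.path(staging, "contact.txt"))
    showNotification("Upload received; the report will be emailed to you.")
  })
}

shinyApp(ui, server)
```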

Additional info:

  1. The analysis pipeline is computationally heavy, but I have access to an HPC cluster running Linux that is operational around the clock.
  2. I would like to host the app on the internet.
  3. I do not expect many concurrent users, as this is a niche application.
  4. I might need additional supporting services like a database or ETL processes, but I'm unsure how to integrate those with Shiny.
  5. While I have a working analysis pipeline and a basic knowledge of how to write Shiny apps, I'm unsure how to tackle this problem, specifically the part about automatically transferring, storing and analyzing client data.

Is anyone familiar with relevant tutorials or reading material for these specific tasks and constraints? Any help would be appreciated.

Best,
Abhay

Do you need to do this all in a Shiny app? Could Shiny be the front end that captures the data locally, with a second, scheduled job that moves or processes the data on your HPC cluster and then emails the result? How instant does this need to be?
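If it helps, here is a rough sketch of what I mean by the scheduled part: a plain R script run by cron that picks up staged submissions, ships them to the cluster, and mails back finished reports. All of the paths, the host name, the `sbatch` script and the blastula email setup are placeholders/assumptions:

```r
# watcher.R: meant to be run by cron, e.g. every 10 minutes:
#   */10 * * * * Rscript /srv/shiny-data/watcher.R
library(blastula)

staging_root <- "/srv/shiny-data/staging"   # where the Shiny app drops uploads
hpc_host     <- "user@hpc.example.org"      # placeholder cluster address

for (submission in list.dirs(staging_root, recursive = FALSE)) {
  id <- basename(submission)

  # 1. New submission: push the data to the cluster and start the pipeline
  if (!file.exists(file.path(submission, ".submitted"))) {
    system2("rsync", c("-az", submission, paste0(hpc_host, ":/data/incoming/")))
    system2("ssh", c(hpc_host, "sbatch /data/pipelines/run_report.sh", id))
    file.create(file.path(submission, ".submitted"))
    next
  }

  # 2. Already submitted: try to pull back the finished report and email it
  if (!file.exists(file.path(submission, ".emailed"))) {
    system2("rsync", c("-az",
                       paste0(hpc_host, ":/data/incoming/", id, "/report.html"),
                       submission))
    report <- file.path(submission, "report.html")
    if (file.exists(report)) {
      email <- add_attachment(
        compose_email(body = md("Your analysis report is attached.")),
        report
      )
      smtp_send(email,
                to      = readLines(file.path(submission, "contact.txt")),
                from    = "reports@example.org",
                subject = "Your analysis report",
                # password is read from the SMTP_PASSWORD environment variable
                credentials = creds_envvar(user = "reports@example.org",
                                           host = "smtp.example.org",
                                           port = 587))
      file.create(file.path(submission, ".emailed"))
    }
  }
}
```

The nice thing about this split is that the Shiny app only ever writes files to a folder, so it stays responsive no matter how long the pipeline runs.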

Sounds like you should check out dbplyr. It provides a wonderful dplyr backend for SQL databases and works with R's tidyverse. It's very useful with Shiny because you can store your data outside the Shiny app itself and rely on dbplyr's lazy evaluation: queries run in the database and only the data you actually need is pulled into the app, so it doesn't slow down.
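A minimal sketch of the pattern (the table and column names are made up, and SQLite is used here only so it runs anywhere; the same code works with Postgres, MySQL, etc.):

```r
library(DBI)
library(dplyr)
library(dbplyr)

# Connect to a database that lives outside the Shiny app
con <- dbConnect(RSQLite::SQLite(), "submissions.sqlite")

# tbl() returns a lazy reference: nothing is pulled into R yet
submissions <- tbl(con, "submissions")

# Filters and selections are translated to SQL and run in the database;
# only collect() brings the (small) result into the Shiny session
recent <- submissions %>%
  filter(status == "complete") %>%
  select(client_email, submitted_at, report_path) %>%
  collect()

dbDisconnect(con)
```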

You could host your app on a Linux machine you have access to using Shiny Server, or put it on shinyapps.io.
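For shinyapps.io, the deployment itself is just a couple of rsconnect calls (the account name, token, secret and app path below are placeholders from your shinyapps.io dashboard):

```r
library(rsconnect)

# One-time account setup; token and secret come from the shinyapps.io dashboard
setAccountInfo(name   = "your-account",
               token  = "TOKEN",
               secret = "SECRET")

# Deploy the app directory
deployApp(appDir = "path/to/your/app")
```

One thing to check is whether the hosting environment can actually reach your HPC cluster (e.g. over SSH or rsync); if it can't, self-hosting with Shiny Server on a machine that can may be the easier option.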
