package idea: blob storage and remote file system for shiny

I often find myself in a situation where I want to take a (reactive) object from a shiny session and write it to, or read it from, blob storage (say, Google Cloud Storage, ...) or a remote file system (say, via rsync, ..., or god forbid, Dropbox).

I use this pattern on shinyapps.io, where no persistent local storage is available, but also in other shiny environments, because I generally like to treat my shiny boxes as disposable.
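To make the problem concrete, the ad-hoc version looks roughly like this. This is only a hedged sketch: it assumes {googleCloudStorageR} as the backend, and the bucket and object names are made-up examples.

```r
# A minimal sketch of the ad-hoc pattern, assuming {googleCloudStorageR};
# the bucket and object names are made-up examples.
library(shiny)
library(googleCloudStorageR)

server <- function(input, output, session) {
  results <- reactive({
    # ... some user-specific computation worth persisting ...
    data.frame(x = rnorm(100), when = Sys.time())
  })

  # triggered by e.g. an actionButton("save", ...) in the UI
  observeEvent(input$save, {
    path <- tempfile(fileext = ".rds")
    saveRDS(results(), path)
    # push the serialised object to blob storage so it survives the
    # (disposable) shiny box
    gcs_upload(path, bucket = "my-example-bucket", name = "results.rds")
  })
}
```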

R packages to talk to the storage backends mostly exist already, but I got a bit tired of calling them ad hoc in my server.R, as in the sketch above, so I've written some functions for internal use that abstract some of this away and add some (planned) bells and whistles, such as the following (a rough sketch of how these might fit together comes after the list):

  • encapsulation in a module
  • some across-session shiny::throttle() or shiny::debounce(), lest I overwhelm or race-condition the blob storage or remote file system
  • unique filenames derived from session$token
  • async via {promises}
  • some subtle UI feedback to let the user know what is happening, maybe via showNotification()
  • maybe parallelisation, though this might be overkill or irrelevant, since most shiny sessions are single-threaded anyway (as far as I know)
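
Here's a rough, untested sketch of how several of these could fit together in one module. Everything named here is hypothetical: storage_server(), the blob_write() callback, and the default debounce interval are placeholders rather than an existing API, and the debounce shown is the ordinary per-session shiny::debounce(), not the across-session variant I have in mind.

```r
# Hypothetical sketch only -- storage_server(), blob_write(), and the
# default debounce interval are placeholders, not an existing package API.
library(shiny)
library(promises)
library(future)
plan(multisession)

# Persist a reactive object to some storage backend from a server module.
storage_server <- function(id, object, blob_write, millis = 5000) {
  moduleServer(id, function(input, output, session) {
    # unique, per-session filename derived from session$token
    filename <- paste0(session$token, ".rds")

    # per-session debounce so rapid invalidations don't hammer the backend
    object_slow <- debounce(object, millis)

    observeEvent(object_slow(), {
      path <- file.path(tempdir(), filename)
      saveRDS(object_slow(), path)

      note <- showNotification("Saving...", duration = NULL, session = session)

      # write asynchronously so this (and other) sessions stay responsive
      future_promise(blob_write(path, filename)) %...>%
        (function(value) {
          removeNotification(note, session = session)
          showNotification("Saved.", duration = 3, session = session)
        }) %...!%
        (function(err) {
          removeNotification(note, session = session)
          showNotification(paste("Save failed:", conditionMessage(err)),
                           type = "error", session = session)
        })
    })
  })
}
```

Usage from server.R would then be something like storage_server("persist", object = results, blob_write = function(path, name) gcs_upload(path, bucket = "my-example-bucket", name = name)), again assuming {googleCloudStorageR} as the backend.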

I'm considering factoring this out into a little package, but I'm wondering:

  1. has this been done already and am I just too dense to find it?
  2. is this a bad idea for a really convoluted package? (wouldn't be the first time that's happened to me)
  3. what would folks want in such a package?

I am aware of these related resources/packages:


I also asked this on Twitter.
