Shiny with Docker container network: create a Shiny app in one container while using the R environment from another container

Hi everyone,

I have an R environment container configured exactly the way I want it.

Now I want to create a Shiny app for it, but I do not want to contaminate the clean environment I made before. Therefore, I am thinking of creating another container dedicated to hosting the Shiny app (server.R, ui.R, etc.).

So I want the second container (the Shiny app) to use the first container's R environment via some kind of connection; as far as I know, that would be a Docker network.

Is there a way to accomplish this? In the future I am also considering hosting both containers on a remote server so the Shiny app can be accessed publicly.

I Googled related topics, but I only found articles about how to host a Shiny app within the first R environment container.

Any help will be much appreciated.

Could anyone help? I would really appreciate any suggestions.

I don't really understand what you're trying to accomplish here.

Shiny apps will run in whatever environment you choose to deploy them to. They won't contaminate the memory of other R sessions, though they may write to files used by other R sessions if programmed to do so rather than using temporary files.
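For instance, if the app needs scratch files, writing to a session-specific temporary file keeps it away from files other R processes might use. A minimal sketch (the data and file name here are just placeholders):

```r
# Write intermediate results to a session-specific temporary file
# rather than a fixed path shared with other R processes.
scratch <- tempfile(fileext = ".csv")   # e.g. /tmp/RtmpXXXX/fileYYYY.csv
write.csv(mtcars, scratch, row.names = FALSE)
dat <- read.csv(scratch)
```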

The standard approach to making a Shiny app accessible to the public is to deploy it via shinyapps.io, RStudio Connect, or Shiny Server Open Source. An alternative, ShinyProxy, runs containerised apps.
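As an example, deploying to shinyapps.io usually comes down to a couple of rsconnect calls; the account name, token, and secret below are placeholders you would copy from your shinyapps.io dashboard:

```r
# install.packages("rsconnect")
library(rsconnect)

# Credentials from your shinyapps.io account page (placeholders here)
setAccountInfo(name   = "your-account",
               token  = "YOUR_TOKEN",
               secret = "YOUR_SECRET")

# Deploy the directory containing ui.R/server.R (or app.R)
deployApp(appDir = "path/to/your/shiny-app")
```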


I was thinking about separating the front end and back end, like people do in web programming. In this case, I want the Docker container that contains the R environment to be the back end and the Shiny app container to be the front end, so the separation between the two layers becomes clear and easier to maintain.

Ah, I see what you mean. Yes, I used to do exactly this with node.js and R backend containers in a Kubernetes cluster, with the Shiny app being hosted by ShinyProxy.

I've since moved to using RStudio Connect, and I've moved most of my node.js backend to Google Cloud Functions so I only pay when they're invoked, rather than having a VM sitting idle. I previously moved one R backend to become a Google Cloud Run API, but it'd have been much simpler and faster to deploy it as an API on RStudio Connect.

If you already have the containerised R backend, you have options. You could run both the backend and the Shiny app in the same environment, e.g. a Docker swarm or Kubernetes cluster, probably using ShinyProxy to host the Shiny container. Alternatively, you could deploy your R backend as a containerised API, e.g. on Google App Engine (GAE), as a Plumber API on Google Cloud Run, or on RStudio Connect, and call it from wherever you choose to host your Shiny app. With the right firewall/permissions on your backend API, the Shiny app could then be hosted anywhere (shinyapps.io, RStudio Connect, Shiny Server Open Source, or ShinyProxy).

The advantage of the second method is that it's simple to change where you host your Shiny app if your needs change.
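To make the second approach concrete, here's a rough sketch of a Plumber API in the backend container and a Shiny app in the front-end container calling it over HTTP. The /mean endpoint is a made-up example, and "r-backend" is a placeholder for whatever hostname the backend gets on your Docker network (or its public URL):

```r
## --- plumber.R (inside the backend container) -------------------------
## A hypothetical endpoint; expose whatever your R environment provides.

#* Return the mean of a comma-separated list of numbers
#* @param values e.g. "1,2,3"
#* @get /mean
function(values = "") {
  nums <- as.numeric(strsplit(values, ",")[[1]])
  list(mean = mean(nums, na.rm = TRUE))
}

## --- backend container entrypoint -------------------------------------
# library(plumber)
# plumb("plumber.R")$run(host = "0.0.0.0", port = 8000)

## --- app.R (inside the Shiny container) -------------------------------
library(shiny)
library(httr)

# "r-backend" is a placeholder for the backend's name on the shared
# Docker network; override it with an environment variable if needed.
backend_url <- Sys.getenv("BACKEND_URL", "http://r-backend:8000")

ui <- fluidPage(
  textInput("values", "Comma-separated numbers", "1,2,3"),
  actionButton("go", "Compute mean"),
  verbatimTextOutput("result")
)

server <- function(input, output, session) {
  result <- eventReactive(input$go, {
    resp <- GET(url = paste0(backend_url, "/mean"),
                query = list(values = input$values))
    content(resp, as = "parsed")
  })
  output$result <- renderPrint(result())
}

shinyApp(ui, server)
```

Because the Shiny container only talks to the backend over HTTP, you can later move the API (to Cloud Run, RStudio Connect, etc.) by changing the URL rather than rebuilding the app.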
