Best practice to maintain PSQL+R Shiny connections

I've built an application that does a lot of processing of user data (users can load their data, map the variables, run analyses, review a dashboard, and download the results/report). It's a pretty heavy application, and I'm running into an issue that I'm not sure how best to solve.

The problem is that sometimes the session will unexpectedly disconnect from the PostgreSQL database. This causes problems because just about every corner of the application depends on retrieving or sending information; the app basically doesn't work at all without the connection. What's even worse is that the UI doesn't really inform the user of the problem, it just kind of sits there doing nothing.

The application lives on an EC2 instance inside a Docker container, served to the public through an HTTPS reverse proxy (Caddy) on a registered domain name. Each new session checks for a global connection pool, creates one if it doesn't exist, then checks out a local connection and passes that into all the downstream modules.
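
For context, this is roughly what that setup looks like (simplified sketch, assuming the pool and RPostgres packages; the credential environment variables and pool sizes are placeholders, not the app's actual configuration):

```r
# Simplified sketch of the current setup using pool + RPostgres.
# Env var names and pool sizes are placeholders.
library(shiny)
library(pool)
library(RPostgres)

# Created once per R process (e.g. in global.R) and shared across sessions.
app_pool <- dbPool(
  drv      = RPostgres::Postgres(),
  host     = Sys.getenv("PGHOST"),
  dbname   = Sys.getenv("PGDATABASE"),
  user     = Sys.getenv("PGUSER"),
  password = Sys.getenv("PGPASSWORD"),
  minSize  = 1,
  maxSize  = 5
)

# Close the pool when the whole app shuts down.
onStop(function() poolClose(app_pool))

server <- function(input, output, session) {
  # Each session checks out one connection and hands it to downstream modules.
  conn <- poolCheckout(app_pool)
  session$onSessionEnded(function() poolReturn(conn))

  # ... modules receive `conn` ...
}
```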

I'm wondering how others have addressed this problem. Should I,

  1. use a global pool, then check out a single connection and test for a severed connection at the start of each function? This is my current (unfinished) approach, and it doesn't seem great.

  2. search for a connection pool and check out a connection at the start of each function, then return it at the end? This would take a bit of time to implement (and test), but seems like a reasonable solution (see the sketch after this list).

  3. check for a connection every minute, and if one doesn't exist, create one? I'm guessing this would need to happen in each module independently.
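
To make option 2 concrete, here's roughly what I'm imagining, assuming modules can take the pool object itself rather than a checked-out connection; `get_user_data()` and `ensure_conn()` are made-up helper names just for illustration:

```r
# Rough sketch of option 2: pass the pool itself downstream and let each
# query check out / return a connection automatically. get_user_data() and
# ensure_conn() are hypothetical helpers, not real app functions.
library(pool)
library(DBI)

get_user_data <- function(pool, user_id) {
  # dbGetQuery() on a pool object checks out a connection, runs the query,
  # and returns the connection, so no session holds one connection long
  # enough for the server to sever it.
  dbGetQuery(
    pool,
    "SELECT * FROM user_data WHERE user_id = $1",
    params = list(user_id)
  )
}

# Guard in the style of options 1/3, for code paths that still hold a raw
# connection: re-check out if the existing connection has gone stale.
ensure_conn <- function(pool, conn) {
  if (is.null(conn) || !DBI::dbIsValid(conn)) {
    conn <- poolCheckout(pool)
  }
  conn
}
```

If that's on the right track, option 3 might not be needed at all, since (as I understand it) the pool validates connections before handing them out rather than relying on a per-minute timer in every module.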

Any direction will be greatly appreciated.

Thanks,
