Automatically reconnecting to PostgreSQL database

(Also posted a version on Stack Overflow but didn't get any answers.)

Does anyone know of a way to maintain a stable connection to a Postgres database with existing connection management packages like pool, DBI, etc.? In my case, I need a Plumber API to connect to a Postgres database and be resilient to the database being replaced by a new version (as part of a daily data refresh), but my scenario would also apply in the case of a Shiny app connecting to a DB.

The general issue is that if you want to create a new database and promote it to production via a simple rename, you must first disconnect existing connections to the old one. None of the DB-related packages I've tried have logic that attempts to reconnect to the database when the connection is lost. The only way I know to fix it is to restart the app, which calls dbPool/dbConnect again.
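To make the promotion step concrete, here's a minimal sketch of the rename-based swap, using DBI from R. The database names (`mydb`, `mydb_new`) and the use of the `postgres` maintenance database are assumptions for illustration; `pg_terminate_backend` and `ALTER DATABASE ... RENAME` are standard Postgres features. This is exactly the step that kills every open connection to the old database:

```r
library(DBI)
library(RPostgreSQL)

# Connect to a maintenance database (e.g. "postgres"), not the one being renamed
admin <- dbConnect(
  PostgreSQL(),
  dbname = "postgres",
  host = Sys.getenv("DB_HOST"),
  user = Sys.getenv("DB_USER"),
  password = Sys.getenv("DB_PASSWORD")
)

# Force-disconnect any sessions still attached to the old database
# (a rename fails while other sessions are connected to it)
dbGetQuery(admin, "
  SELECT pg_terminate_backend(pid)
  FROM pg_stat_activity
  WHERE datname = 'mydb' AND pid <> pg_backend_pid()
")

# Promote the freshly built copy via a rename
dbExecute(admin, "ALTER DATABASE mydb RENAME TO mydb_old")
dbExecute(admin, "ALTER DATABASE mydb_new RENAME TO mydb")

dbDisconnect(admin)
```

Any app holding a pool against the old `mydb` now has dead connections, which is what triggers the reconnect problem described above.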

I believe this same problem would also manifest in the case of a DB server restart, which with some hosting providers might occur at an unpredictable time.

Has anyone ever dealt with this problem? Is there a way to capture an error like Error in postgresqlExecStatement globally and attempt a reconnect?

How do other, non-R applications handle this scenario? It's hard to envision how this would work if the new database had to re-authenticate the user, since that can only be done by restarting the DB session.

I know that auto-reconnection to MySQL databases has been a feature in Rails for a while:
https://guides.rubyonrails.org/2_3_release_notes.html#reconnecting-mysql-connections

In the case of Postgres+Ruby, it seems you need to manually catch the exception and reconnect yourself. That's straightforward there, since Ruby makes monkey-patching classes/modules trivial. I don't have experience with other languages' solutions to this problem.

My current workaround is this:

library(DBI)
library(pool)
library(RPostgreSQL)

pool <- dbPool(
  drv = PostgreSQL(),
  host = Sys.getenv("DB_HOST"),
  port = 5432,
  dbname = Sys.getenv("DB_NAME"),
  user = Sys.getenv("DB_USER"),
  password = Sys.getenv("DB_PASSWORD")
)

safe_pool <- function() {
  tryCatch({
    # probe with a fast query to see if the connection is still alive
    dbExecute(pool, "SELECT 1")
  }, error = function(e) {
    # match on a substring rather than the exact message text, which is fragile
    if (grepl("no connection to the server", conditionMessage(e), fixed = TRUE)) {
      # the old pool is dead: close it (ignoring errors, since the
      # connections are already gone) and rebuild it in the enclosing
      # environment
      try(poolClose(pool), silent = TRUE)

      pool <<- dbPool(
        drv = PostgreSQL(),
        host = Sys.getenv("DB_HOST"),
        port = 5432,
        dbname = Sys.getenv("DB_NAME"),
        user = Sys.getenv("DB_USER"),
        password = Sys.getenv("DB_PASSWORD")
      )
    } else {
      # some other, unexpected error: re-raise it
      stop(e)
    }
  })

  pool
}

A more complete reprex is here:

It's clunky and technically turns n queries into n + 1 queries, but it works. At some point, I might work on a cleaner solution that wraps the actual functions I'm using (e.g. tbl) so the test query can be avoided. Really, I think something like this should be added to the pool package, but I don't personally have the R chops to do it.
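For anyone curious, here is a rough sketch of that wrapper idea: instead of probing with SELECT 1 before every call, run the real query and rebuild the pool only when it fails with a lost-connection error, so the happy path stays at n queries. The helper names (`make_pool`, `with_reconnect`) and the error-message substring are my assumptions, not a tested API:

```r
library(DBI)
library(pool)
library(RPostgreSQL)

# Pool construction factored out so it can be called again after a failure
make_pool <- function() {
  dbPool(
    drv = PostgreSQL(),
    host = Sys.getenv("DB_HOST"),
    port = 5432,
    dbname = Sys.getenv("DB_NAME"),
    user = Sys.getenv("DB_USER"),
    password = Sys.getenv("DB_PASSWORD")
  )
}

pool <- make_pool()

# Run query_fn against the pool; on a lost connection, rebuild the pool
# and retry the same query once
with_reconnect <- function(query_fn) {
  tryCatch(
    query_fn(pool),
    error = function(e) {
      if (grepl("no connection to the server", conditionMessage(e), fixed = TRUE)) {
        try(poolClose(pool), silent = TRUE)
        pool <<- make_pool()
        query_fn(pool)  # retry once against the fresh pool
      } else {
        stop(e)
      }
    }
  )
}

# Usage (hypothetical table name): no extra probe query on the happy path
# result <- with_reconnect(function(p) dbGetQuery(p, "SELECT count(*) FROM my_table"))
```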
