I know that auto-reconnection to MySQL databases has been a feature in Rails for a while:
https://guides.rubyonrails.org/2_3_release_notes.html#reconnecting-mysql-connections
In the case of Postgres+Ruby, it seems you need to catch the exception and reconnect yourself. That's straightforward to do, since monkey-patching classes/modules in Ruby is so easy. I don't have experience with how other languages solve this problem.
My current workaround is this:
library(pool)        # dbPool(), poolClose()
library(DBI)         # dbExecute(), dbGetQuery()
library(RPostgreSQL) # PostgreSQL() driver

pool <- dbPool(
  drv = PostgreSQL(),
  host = Sys.getenv("DB_HOST"),
  port = 5432,
  dbname = Sys.getenv("DB_NAME"),
  user = Sys.getenv("DB_USER"),
  password = Sys.getenv("DB_PASSWORD")
)
safe_pool <- function() {
  tryCatch({
    # try a fast query
    dbExecute(pool, "SELECT 1")
  }, error = function(e) {
    if (e$message == "RS-DBI driver: (could not run statement: no connection to the server\n)") {
      # create a new pool in enclosing environment
      poolClose(pool)
      pool <<- dbPool(
        drv = PostgreSQL(),
        host = Sys.getenv("DB_HOST"),
        port = 5432,
        dbname = Sys.getenv("DB_NAME"),
        user = Sys.getenv("DB_USER"),
        password = Sys.getenv("DB_PASSWORD")
      )
    } else {
      # there was an unknown error
      stop(e)
    }
  })
  pool
}
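For context, here is roughly how it gets used: callers go through safe_pool() instead of touching pool directly, so the connection check (and any reconnect) happens before every query. The table name below is just a placeholder.

# ask safe_pool() for the pool instead of using `pool` directly
rows <- dbGetQuery(safe_pool(), "SELECT * FROM some_table LIMIT 10")

# the same pattern works for dplyr/dbplyr lazy tables
library(dplyr)
some_tbl <- tbl(safe_pool(), "some_table")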
A more complete reprex is here:
It's clunky and technically turns n queries into n+1 queries, but it works. At some point, I might work on a cleaner solution that wraps the actual functions I'm using (e.g. tbl) in order to avoid the test query; a rough sketch of what I mean is below. Really, I think something like this should be added to the pool package, but I don't personally have the R chops to do it.
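Roughly, the wrapper I have in mind would look something like this (an untested sketch; the with_reconnect name, the error-message matching, and the example query are placeholders of my own):

# hypothetical sketch: run a query-producing function against the pool,
# and if the connection was dropped, rebuild the pool and retry once
with_reconnect <- function(f) {
  tryCatch(
    f(pool),
    error = function(e) {
      if (grepl("no connection to the server", e$message, fixed = TRUE)) {
        # the old pool's connections are likely dead, so ignore errors on close
        try(poolClose(pool), silent = TRUE)
        pool <<- dbPool(
          drv = PostgreSQL(),
          host = Sys.getenv("DB_HOST"),
          port = 5432,
          dbname = Sys.getenv("DB_NAME"),
          user = Sys.getenv("DB_USER"),
          password = Sys.getenv("DB_PASSWORD")
        )
        # retry the original call once against the fresh pool
        f(pool)
      } else {
        stop(e)
      }
    }
  )
}

# the failed call itself is retried, so there is no extra "SELECT 1" round trip
n <- with_reconnect(function(p) dbGetQuery(p, "SELECT count(*) FROM some_table"))

The difference from safe_pool() is that the real query doubles as the connection check, so nothing extra is sent when the connection is healthy.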