I am trying to fetch a million records from a PostgreSQL database into a Shiny app. The dashboard works fine with a small amount of data, but as the data grows the load time increases; the application now takes about 2 minutes to start.
library(pool)
library(DBI)
library(RPostgreSQL)  # provides the "PostgreSQL" driver used below
library(dplyr)        # for tbl() and %>%

config <- config::get()

pool <- dbPool(
  drv = dbDriver("PostgreSQL"),
  host = config$host,
  dbname = config$dbname,
  port = config$port,
  user = config$user,
  password = config$password
)

onStop(function() {
  poolClose(pool)
})

# Pull the entire materialized view into local memory
get_data <- function(pool) {
  pool %>% tbl("matview") %>% collect()
}

appdata <- get_data(pool = pool)
View(appdata)
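To confirm the bottleneck is the data transfer itself rather than dbplyr's lazy-query layer, one check I can run is timing a raw fetch directly through DBI (a sketch against the same `pool` object; `matview` is the materialized view from the code above):

```r
library(DBI)

# Time the raw fetch alone, bypassing tbl()/collect(),
# to isolate how long the transfer from Postgres takes.
timing <- system.time({
  raw_data <- dbGetQuery(pool, "SELECT * FROM matview")
})
print(timing)
```

If this is also around two minutes, the cost is in the driver/network transfer, not in how the query is built.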
`matview` is a materialized view with 2 million rows. All aggregations have already been performed, so the data is reduced to the minimum possible level: 16 columns, a mix of numeric, datetime, and character types. Machine: 3.3 GHz processor, 16 GB RAM. Is there any way the fetch speed could be improved?
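One variant I am considering but have not benchmarked yet is swapping the legacy RPostgreSQL driver for the newer RPostgres package, which I understand has a faster C-level fetch path; the pool setup would only change in the `drv` argument:

```r
library(pool)
library(RPostgres)  # assumed installed; alternative DBI backend to RPostgreSQL

config <- config::get()

# Same pool as before, but using the RPostgres driver instead of
# RPostgreSQL's dbDriver("PostgreSQL").
pool <- dbPool(
  drv = RPostgres::Postgres(),
  host = config$host,
  dbname = config$dbname,
  port = config$port,
  user = config$user,
  password = config$password
)
```

Would that driver swap, or something like caching the fetched table to disk between sessions, be the right direction here?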
Thank you!!