How to fetch a million records from a database into Shiny (R) faster?

I am trying to fetch a million records from a PostgreSQL database. The Shiny dashboard works fine when there is a small amount of data, but as the data grows, the app takes longer and longer to load. It now takes about two minutes to start.

library(shiny)
library(pool)
library(DBI)
library(RPostgreSQL)
library(dplyr)

config <- config::get()

# Pooled connection to PostgreSQL, credentials read from config.yml
pool <- dbPool(
  drv = dbDriver("PostgreSQL"),
  host = config$host,
  dbname = config$dbname,
  port = config$port,
  user = config$user,
  password = config$password
)

# Close the pool when the app stops
onStop(function() {
  poolClose(pool)
})

# Pull the whole materialized view into memory
get_data <- function(pool) {
  appdata <- pool %>% tbl("matview") %>% collect()
  return(appdata)
}

appdata <- get_data(pool = pool)
View(appdata)

matview is a materialized view with 2 million rows; all aggregations have already been performed and the data has been reduced to the minimum possible level (16 columns with a mix of numeric, datetime, and character columns). Processor speed: 3.3 GHz, memory: 16 GB. Is there any way the speed could be improved?

Thank you!!
https://stackoverflow.com/questions/53121567/how-to-fetch-a-million-records-into-shiny-r-faster-from-postgresql

Have you by chance posted a similar question on Stack Overflow?

If so, then:

In general, it is good manners either not to cross-post or, if cross-posting is necessary, to state the fact. It can be awkward for someone trying to help to address your question on one site, only to see it pop up on the other.


Yes, that's the same question. My apologies, I am new to this forum.
Thank you!!

No offense meant, and welcome (both here and to SO 🙂)

There are many ways to deal with this. You could chunk your materialized view into smaller pieces and display the first chunk to the user while rbinding the rest behind the scenes (see the first sketch below). Another way is to sample the data and only perform the full query when the user clicks a button (see the second sketch below).
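For the chunking idea, here is a minimal sketch. The 100,000-row chunk size and the get_chunk/get_all_chunked helpers are made up for illustration; matview and pool are taken from your post:

library(DBI)
library(pool)

chunk_size <- 100000  # assumed chunk size; tune for your data

# Fetch one slice of the view. In practice, add an ORDER BY on a stable
# key so consecutive chunks never overlap or skip rows.
get_chunk <- function(pool, offset, n = chunk_size) {
  dbGetQuery(
    pool,
    sprintf("SELECT * FROM matview LIMIT %d OFFSET %d", n, offset)
  )
}

# Show the first chunk immediately, then assemble the rest. Running this
# loop with {future}/{promises} would keep the UI responsive meanwhile.
get_all_chunked <- function(pool) {
  n_rows <- dbGetQuery(pool, "SELECT count(*) AS n FROM matview")$n
  offsets <- seq(0, n_rows - 1, by = chunk_size)
  do.call(rbind, lapply(offsets, function(o) get_chunk(pool, o)))
}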

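For the sample-then-button approach, a sketch along these lines could work. The 1% sampling rate and the input/output names are arbitrary, and TABLESAMPLE requires PostgreSQL 9.5+; it reuses the pool object from your post:

library(shiny)
library(DBI)
library(dplyr)

# Cheap preview: ask the server for roughly 1% of the pages
get_sample <- function(pool) {
  dbGetQuery(pool, "SELECT * FROM matview TABLESAMPLE SYSTEM (1)")
}

ui <- fluidPage(
  actionButton("load_all", "Load full data"),
  tableOutput("preview")
)

server <- function(input, output, session) {
  appdata <- reactiveVal(get_sample(pool))  # start with the sample

  # Replace the sample with the full table only when the user asks
  observeEvent(input$load_all, {
    appdata(pool %>% tbl("matview") %>% collect())
  })

  output$preview <- renderTable(head(appdata(), 20))
}

shinyApp(ui, server)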

Unfortunately I cannot display chunks to the user. I will try the second solution. Thank you!!