Speeding Up Shiny/Flexdashboard App

Hi friends,
I hope all of you are doing well and staying safe.

I have a general question and would very much appreciate any suggestions.
How can we speed up a Shiny/flexdashboard app that queries MS SQL, where the data is always a data frame of about 30 columns and more than 30,000 rows?
At the moment the most expensive calculations are using the pivottabler package to create a pivot table and using ggplot to create column graphs based on the pivot calculations.

Regards

Maybe there are two ways to look at it. One, could you move the query and pivot code out of the app's code and have the app just pull the results? That is, create another script where the processing is done ahead of time (run locally or on a schedule), and have the app simply read the precomputed data.
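Here's a minimal sketch of what that precompute script could look like. The DSN, table, and column names (`sales`, `region`, `month`, `amount`) and the file paths are placeholders for whatever your setup actually uses:

```r
# precompute.R -- run outside the app, e.g. on a schedule (cron / Task Scheduler)
library(DBI)
library(odbc)
library(dplyr)

# Connect to SQL Server and pull the raw data (DSN and table are placeholders)
con <- dbConnect(odbc::odbc(), dsn = "my_sql_server_dsn")
dat <- dbGetQuery(con, "SELECT * FROM sales")
dbDisconnect(con)

# Do the expensive summarising here instead of inside the app
agg <- dat %>%
  group_by(region, month) %>%
  summarise(total = sum(amount), .groups = "drop")

# Save the results where the app can read them
saveRDS(agg, "app_data/sales_agg.rds")
```

Then the app itself only has to call `readRDS("app_data/sales_agg.rds")` at startup (or in a reactive that watches the file), which should be much faster than querying and pivoting on every run.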

The other option is maybe a parallel processing package? I haven't tried any of these personally, but they may be an option if your setup allows it; see the sketch after these links.
https://www.r-bloggers.com/simple-parallel-processing-in-r/

Also here:
https://nceas.github.io/oss-lessons/parallel-computing-in-r/parallel-computing-in-r.html
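As a rough illustration of the idea from those links, the base parallel package can spread an expensive per-group computation across cores. The data file, the `region` column, and the summary step are assumptions, not something from your app:

```r
library(parallel)

# Placeholder data: split into one chunk per region (column names are assumed)
dat <- readRDS("app_data/sales_agg.rds")
by_region <- split(dat, dat$region)

# Start one worker per core, leaving a core free for the app itself
cl <- makeCluster(max(1, detectCores() - 1))
results <- parLapply(cl, by_region, function(d) {
  # the expensive per-group work (pivot/summary) would go here
  data.frame(region = d$region[1], total = sum(d$total))
})
stopCluster(cl)

# Combine the per-group results back into one data frame
summary_df <- do.call(rbind, results)
```

One caveat: for a data set of this size (~30,000 rows) the overhead of starting workers can outweigh the gain, so it's worth benchmarking before and after.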

