I have developed a few Shiny apps. The apps are deployed on a local server using Dokku, Docker, and ShinyProxy, and colleagues at my company use them several times a day. Some of the apps run without problems because they only perform small calculations. Other apps perform larger calculations on larger amounts of data, which results in very long loading times.
The data the apps access is all stored in a PostgreSQL database, which is also located on the local server.
For example, one of the apps calculates the delivery capacity of more than 6,000 articles for the next 12 months. Users can filter the articles by manufacturer, and that filter operation alone takes about 20 seconds.
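To illustrate the pattern (this is a simplified sketch with placeholder column and function names, not my production code): the app essentially pulls the full article table from PostgreSQL and filters it in R on every request, roughly like this:

```r
# Simplified sketch of the filter step. In production `items` has one row
# per article (~6,000 rows) plus many columns of capacity data; here it is
# a toy data frame so the example is self-contained.
filter_items <- function(items, manufacturers) {
  # Keep only rows whose manufacturer is in the user's selection.
  items[items$manufacturer %in% manufacturers, , drop = FALSE]
}

items <- data.frame(
  article_id   = 1:4,
  manufacturer = c("A", "B", "A", "C"),
  stringsAsFactors = FALSE
)

filtered <- filter_items(items, c("A"))
nrow(filtered)  # 2
```

The filtering itself is trivial; the cost in the real app comes from recomputing the 12-month capacity figures over the whole filtered set each time.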
Another app has a search function that lets users search more than 300,000 article reviews by keyword and shows some additional statistics on the matches. The search sometimes takes up to 60 seconds.
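Again as a simplified sketch (placeholder names, toy data): all reviews are loaded into R and scanned with `grepl()`, and the statistics are computed over the matches:

```r
# Simplified sketch of the review search. In production `reviews` has
# ~300,000 rows; here it is a toy data frame so the example runs as-is.
search_reviews <- function(reviews, keyword) {
  # Case-insensitive substring match over every review text.
  hits <- reviews[grepl(keyword, reviews$text, ignore.case = TRUE), , drop = FALSE]
  list(
    hits   = hits,
    n_hits = nrow(hits),
    share  = nrow(hits) / nrow(reviews)  # example statistic: share of matching reviews
  )
}

reviews <- data.frame(
  id   = 1:3,
  text = c("Great battery life", "battery died fast", "Nice screen"),
  stringsAsFactors = FALSE
)

result <- search_reviews(reviews, "battery")
result$n_hits  # 2
```

A full linear scan like this is what ends up running inside the Shiny session on every search.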
Such loading times make the apps almost unusable, especially since the amount of data is constantly growing.
My question (setting aside the performance of my R code itself) is: can the infrastructure be made more performant by adding other technologies I am not yet familiar with (distributed systems, Spark, etc.)? Or is Shiny simply reaching its limits in use cases where the amount of data and the computational effort are this large?
Are there any helpful case studies that show Shiny apps in enterprise contexts?