Hello everyone,
I've published an app on an RStudio Connect server. It works fine just after publishing (good timing to run the small calculations I need), BUT the day after, the app takes about 90 seconds to open and roughly 3× longer to perform the same tasks.
It's a dashboard app that monitors a production process. It doesn't do heavy computation, but I have multiple large .CSV files (approx. 250 MB total; one year of production data).
To keep it up to date, I re-read the CSVs every 30 minutes. When I monitor mem_used after each import, the amount of RAM used steps up by approx. 30 MB each time, going from 500 MB to over 2 GB. I don't understand why the memory grows like this.
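For context, this is roughly how I log memory around each import (a minimal sketch, not my exact code; the `log_memory` helper and its label are just illustrative):

```r
library(pryr)  # provides mem_used()

# Print current R memory usage, after forcing a garbage collection
# so the number reflects live objects rather than unreclaimed ones.
log_memory <- function(label) {
  gc()
  message(label, ": ", capture.output(print(pryr::mem_used())))
}

# log_memory("after import")  # called once per 30-minute refresh
```

Even with `gc()` called like this, the reported usage keeps climbing by ~30 MB per refresh.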
My data takes approx. 25 seconds to load, so I set the runtime like this to get an always-ready app:
- Max processes: 1
- Min processes: 1
- Max connections per process: 50
- Load factor: 1.00
Here is how I manage the import in global.R:
```r
r <- reactiveValues(inputs = reactiveValues())
refreshTimer30min <- reactiveTimer(1000 * 60 * 30)  # in ms
extract$Production.Line.Name <- dplyr::recode(extract$Production.Line.Name, !!!ligne_list)
extract$tarepoche <- dplyr::recode(extract$Production.Line.Name, !!!tare_poche)
"%Y/%m/%d %H:%M", tz = "")),
format = "%Y/%m/%d")
```
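The refresh pattern I'm aiming for looks roughly like this (a hedged sketch, not my full code: the file path, the `observe` wrapper, and `r$extract` are placeholders; `ligne_list` is the named vector from my global.R):

```r
library(shiny)
library(dplyr)

r <- reactiveValues(extract = NULL)
refreshTimer30min <- reactiveTimer(1000 * 60 * 30)  # 30 minutes, in ms

observe({
  refreshTimer30min()

  # Re-read the full CSV (placeholder path) on each tick.
  extract <- read.csv("data/production.csv")
  extract$Production.Line.Name <- dplyr::recode(extract$Production.Line.Name, !!!ligne_list)

  # Overwrite the previous data rather than keeping extra copies around,
  # then drop the local binding and force garbage collection.
  r$extract <- extract
  rm(extract)
  gc()
})
```

My understanding is that overwriting `r$extract` should let R reclaim the old data, yet the process memory still grows.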
Is this a mistake in the way my code works, or in the runtime settings?
Thanks for your help.