Showing "out of memory" while running the Shiny app

Hi everybody!

I'm getting the error "exit 137" because my app is running out of memory.

Can anyone let me know:

  1. What is the maximum app size we can load on shinyapps.io?
  2. My files total 85.6 MB; is that large?
  3. I am using the free plan of shinyapps.io; how do I check the memory usage?

Thank you.

See the documentation

shinyapps.io hosts each app on its own virtualized server, called an instance. Each instance runs an identical copy of the code, packages, and data that you deployed; collectively, this is called an image. The bundle size that can be uploaded is limited to 1 GB for the Free and Starter plans, and up to 8 GB for the Basic, Standard and Professional plans.

For the paid plans there is a configuration option to increase memory: rsconnect::configureApp("APPNAME", size = "small") is the form, with sizes going up to "xxlarge" for 8192 MB.

Note that the documentation also states that, even with the top three paid plans:

Do not set the maximum size larger than the limit for your shinyapps.io plan, or application deployments will fail.
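
For illustration, a minimal sketch of that call, assuming an app already deployed under the (hypothetical) name "APPNAME" and a plan that permits the larger instance sizes:

library(rsconnect)
# Hypothetical app name; pick a size your plan allows, from "small" up to "xxlarge".
configureApp("APPNAME", size = "xlarge")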

Also, you need to factor in not only the size of your files but also that of all the packages in your environment.

The lobstr package will give you a sense of the image size. For example, base R alone:

> library(lobstr)
> mem_used()
51,664,512 B

Add tidyverse

> library(tidyverse)
── Attaching packages ──────────────────────────────────────────────────────────── tidyverse 1.2.1 ──
✔ ggplot2 3.1.0     ✔ purrr   0.2.5
✔ tibble  1.4.2     ✔ dplyr   0.7.8
✔ tidyr   0.8.2     ✔ stringr 1.3.1
✔ readr   1.3.1     ✔ forcats 0.3.0
── Conflicts ─────────────────────────────────────────────────────────────── tidyverse_conflicts() ──
✖ dplyr::filter() masks stats::filter()
✖ dplyr::lag()    masks stats::lag()
> mem_used()
71,262,184 B
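
So in this session, attaching the tidyverse adds roughly 71,262,184 B − 51,664,512 B ≈ 19.6 MB to the footprint.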

If I understand the docs correctly, your image includes all the libraries referenced in your app, whether or not they are loaded. If you open a local session and load them all, mem_used() will tell you how much of your image goes towards that end. I don't think you can get around this by using require() instead of library(), or by calling functions with the :: operator (e.g. dplyr::select()) instead of attaching the package.
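
As a rough sketch (the package list here is a hypothetical stand-in for whatever your app actually attaches), you can load everything locally and check the total:

library(lobstr)
# Hypothetical package list -- replace with the packages your own app attaches.
pkgs <- c("shiny", "tidyverse", "DT")
invisible(lapply(pkgs, library, character.only = TRUE))
mem_used()   # memory used by this session once everything is loaded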

Avoid holding large intermediate objects in memory:

source  <- read_csv(somefile)         # full data set held in memory
reduced <- filter(source, something)  # a reduced copy, while the full copy stays around too

Instead, chain the steps with the pipe operator %>%:

reduced <- read_csv(somefile) %>% filter(something)  # or select() to keep/drop columns of a data frame
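
For a concrete version of that pattern, assuming the tidyverse is attached as in the example above (the file and column names below are hypothetical stand-ins):

reduced <- read_csv("big_file.csv") %>%   # hypothetical file name
  filter(year >= 2015) %>%                # drop rows you don't need
  select(id, year, value)                 # keep only the columns you need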

If you're still bumping up against the limit, you may have to upgrade plans.

