Deploy with sparkR

Hello! Can we deploy an app built with SparkR to shinyapps.io? I'm trying to publish the basic app below. It runs locally, but the deployment fails. Any suggestions to make it work?

Thanks in advance!

library(shiny)

Sys.setenv(SPARK_HOME = "/opt/spark")
library(SparkR, lib.loc = c(file.path(Sys.getenv("SPARK_HOME"), "R", "lib")))
sparkR.session()

dados <- read.df("materiais.csv",
                 header = "true",
                 source = "csv")


ui <- fluidPage(
    
        verticalLayout(
            tableOutput("media")
        )
        
)

server <- function(input, output) {
    
    output$media <- renderTable({
        head(dados)
    })
}

# Run the application 
shinyApp(ui = ui, server = server)

Hi, @alefego , welcome to Community!

Right, the bottom line is that SparkR is not on CRAN (CRAN - Package SparkR), so there's no common source from which shinyapps.io can pull it.

An alternative would be using sparklyr. As far as we know, there are no restrictions on running Spark in shinyapps.io. But since each session is containerized, you need to install Spark every time the app starts, via sparklyr::spark_install(). You may also need to explicitly set the Java memory size required for the app.
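As a rough sketch (untested on shinyapps.io; the Spark version, driver-memory value, and CSV path are assumptions, not tested settings), the app's setup code with sparklyr might look like this:

```r
library(shiny)
library(sparklyr)

# Install a local Spark distribution inside the container on first start.
# On shinyapps.io each session is a fresh container, so this runs every
# time and startup will be slow. (Version is an assumption; adjust.)
if (nrow(spark_installed_versions()) == 0) {
  spark_install(version = "3.4")
}

# Explicitly cap the Java driver memory to fit the instance size
# ("1g" here is an assumed value, not a recommendation).
conf <- spark_config()
conf$`sparklyr.shell.driver-memory` <- "1g"

sc <- spark_connect(master = "local", config = conf)

# Read the CSV through Spark; this replaces SparkR::read.df().
dados <- spark_read_csv(sc, name = "materiais",
                        path = "materiais.csv", header = TRUE)
```

The rest of the app (ui/server) can stay as in the original post, except that renderTable would need the Spark table collected or head()-ed via dplyr verbs rather than SparkR's head().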

Having said all that, the main question is "why?". Is there something you need in Spark that is not available in plain R? If so, can you share what that is? The cost of instantiating Spark for every app session seems a high price to pay in performance terms, especially if the data conversion or model could be run using R alone.
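To illustrate that last point: for a CSV that fits in memory, the same app needs no Spark at all. A minimal plain-R sketch, assuming materiais.csv sits next to app.R:

```r
library(shiny)

# Base R replaces SparkR::read.df() for an in-memory CSV.
dados <- read.csv("materiais.csv", header = TRUE)

ui <- fluidPage(
  verticalLayout(
    tableOutput("media")
  )
)

server <- function(input, output) {
  output$media <- renderTable({
    head(dados)
  })
}

shinyApp(ui = ui, server = server)
```

This version deploys to shinyapps.io with no extra setup, since shiny is on CRAN and there is nothing to install at session start.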