Shiny Future Promise Implementation in Databricks

Hi,

I am currently trying to write a Shiny application within Databricks that takes a few standard user inputs/filters (date, company, a UPC list, etc.) and uses them to run a SQL query. This query typically takes about 5 minutes to run. After jamming up the cluster a few times, it came to my attention that I should be using the heartbeat workaround, along with promises and future, for performance. I have implemented what I believe faithfully follows the documentation, but I still receive errors. I will attempt a reprex below:

sql_string_raw <- reactive({
  req(x())
  req(y())
  req(z())
  req(a())
  req(b())
  req(c())

  sr <- glue("FUNCTIONING_SQL_HERE")
  sr
})
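(Aside: since those reactive values end up inside the SQL string, glue::glue_sql() quotes them safely rather than pasting them in raw. A hypothetical sketch, where sales, start_date, and upc_list stand in for the real table and inputs, using DBI::ANSI() for standard quoting rules:)

library(glue)

start_date <- x()   # e.g. the date filter
upc_list   <- b()   # e.g. the UPC vector

sr <- glue_sql(
  "SELECT * FROM sales
   WHERE sale_date >= {start_date}
     AND upc IN ({upc_list*})",
  .con = DBI::ANSI()
)

The {upc_list*} splice syntax collapses a vector into a comma-separated, individually quoted list, which is handy for IN clauses.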

# Async portion

raw_sql <- eventReactive(input$BeginAnalysis, {

  busy(1)
  val <- sql_string_raw()

  future_promise({
    s <- sdf_sql(sc, val)
    s
  }) %...>%
    sdf_collect(.)

  busy(0)
})

# Filter the promise

munged_clover <- reactive({
  req(raw_sql())

  something <- raw_sql() %...>%
    filter(!is.na(seg)) %...>%
    group_by(seg) %...>%
    summarize(METRICS_HERE) %...>%
    ungroup(.) %...>%
    mutate(METRICS_HERE) %...>%
    relocate(a_column)

  something
})

Where is this going wrong? Any insight would be appreciated! Judging by the error message below, something in the chain is a logical value rather than a promise, but I'm not sure what.

--- ERROR MESSAGE ---
Use spec() to retrieve the full column specification for this data.
ℹ Specify the column types or set show_col_types = FALSE to quiet this message.
Warning: Error in as.promise.default: Don't know how to convert object of class logical into a promise
168: stop
167: as.promise.default
165: then
164: %...>%
133:
117: munged_SOMETHING
116: renderTable
115: func
102: renderFunc
101: output$SOMETHING
21: runApp
20: print.shiny.appobj
18: capture.output
Unhandled promise error: invalid connection
Unhandled promise error: invalid connection
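
For what it's worth, the "class logical" error has a likely explanation: eventReactive() returns its last expression, which here is busy(0) rather than the promise chain, so raw_sql() yields whatever busy(0) returns (apparently a logical), and that is exactly what %...>% in munged_clover later refuses to convert into a promise. A minimal sketch, assuming busy is a reactiveVal, that returns the promise itself and clears the flag in finally(); collecting inside the future also means a plain data frame, rather than a Spark reference, crosses back to the main process:

library(promises)
library(future)
plan(multisession)

raw_sql <- eventReactive(input$BeginAnalysis, {
  busy(1)
  val <- sql_string_raw()

  p <- future_promise({
    # caveat: the main-process `sc` handle may not be valid in a
    # worker process; see the connection notes further down
    s <- sdf_sql(sc, val)
    sdf_collect(s)
  })

  # clear the flag on success or failure, and return the promise itself
  finally(p, function() busy(0))
})

With that change, raw_sql() yields a promise, the %...>% chain in munged_clover has something it can chain onto, and Shiny's render functions accept a promise as their final value.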

Hi, is the cluster running Hadoop? If so, you may be able to drop the need for a Spark session and use a Hive JDBC or ODBC connection to run the SQL query directly against the Metastore.
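
A rough sketch of that suggestion, assuming the Simba Spark ODBC driver is available (the driver name, host, and HTTP path below are placeholders, and the token is read from an environment variable). Opening the connection inside the future also sidesteps the "invalid connection" errors, since each worker owns its own handle:

library(DBI)
library(odbc)
library(promises)
library(future)
plan(multisession)

run_query_async <- function(sql_string) {
  future_promise({
    # placeholder connection details: adjust Driver/Host/HTTPPath to the workspace
    con <- dbConnect(
      odbc::odbc(),
      Driver   = "Simba Spark ODBC Driver",
      Host     = "MY_WORKSPACE_HOST",
      Port     = 443,
      HTTPPath = "MY_HTTP_PATH",
      AuthMech = 3,
      UID      = "token",
      PWD      = Sys.getenv("DATABRICKS_TOKEN"),
      SSL      = 1
    )
    on.exit(dbDisconnect(con), add = TRUE)
    dbGetQuery(con, sql_string)  # returns a plain data frame
  })
}

run_query_async(sql_string_raw()) could then take the place of the future_promise() block inside the eventReactive above.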

The cluster is running Spark 3.1.2, connected with:

sc <- spark_connect(method = "databricks")
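
On the "Unhandled promise error: invalid connection" lines: a connection handle created in the main R process is generally not usable inside a future_promise() worker, because the worker is a separate R process. If the sparklyr route is kept, the connection would probably need to be opened inside the future block itself, something like the following (untested, and whether spark_connect() works from a background process depends on the Databricks runtime; the ODBC route above avoids the question entirely):

future_promise({
  # assumption: the worker can establish its own connection; the outer
  # `sc` handle does not survive serialization to another process.
  # `val` is the SQL string captured from the enclosing scope.
  sc_w <- sparklyr::spark_connect(method = "databricks")
  on.exit(sparklyr::spark_disconnect(sc_w), add = TRUE)
  sparklyr::sdf_collect(sparklyr::sdf_sql(sc_w, val))
})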
