Once a script deployed to Connect as a plumber API is called, Connect's sandboxing rules apply.
In essence, your process runs in a protected space, and anything it writes to the local file system is only available to that process while it is running. In other words, if you simply use writeLines(...), you won't be able to access the output outside of Connect.
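For instance, a handler that logs like this works fine when you run it locally, which is exactly what makes it a trap (the file name is just for illustration):

```r
# Naive approach: on Connect this file lands in the process's sandboxed
# working directory, so it is gone (from your point of view) as soon as
# the process ends. Locally it behaves as expected.
log_line <- sprintf("%s scored a new ticket", format(Sys.time()))
writeLines(log_line, con = "api.log")
```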
The solution to this is to write to a destination that you can access outside of Connect. In general, this can be:
- A database
- Cloud storage, e.g. AWS S3 storage buckets
- A cloud service, e.g. Google Sheets
- A shared folder on the Connect server, i.e. a top level folder you create on the server and have access to
- Network file share
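To make the shared-folder option concrete, here is a minimal sketch that appends one record per request to a CSV file. The folder name /app-data is an assumption (a top-level folder you would create on the Connect server); tempdir() is only a fallback so the sketch runs anywhere:

```r
# Append one record per request to a CSV in a shared folder.
# "/app-data" is an assumed folder on the Connect server that your
# process can write to; tempdir() is a stand-in for local testing.
shared_dir  <- if (dir.exists("/app-data")) "/app-data" else tempdir()
record_file <- file.path(shared_dir, "ticket-scores.csv")

record_locally <- function(id, title, score) {
  newline <- data.frame(timestamp = Sys.time(), id = id,
                        title = title, score = score)
  write.table(newline, file = record_file, sep = ",",
              append    = file.exists(record_file),
              col.names = !file.exists(record_file),
              row.names = FALSE)
}

record_locally(1L, "Server down", 0.87)
```

Because the file lives outside the sandboxed working directory, it survives the request and can be read by other processes (or downloaded from the server) later.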
For example, earlier this week I deployed an API that takes incoming data and records what it did by appending a new row to a Google Sheet. This is quite easy to do, thanks to the googlesheets package.
Here is the real code I deployed to Connect as part of the proof of concept:
```r
library(googlesheets)

if (!file.exists(".httr-oauth")) {
  stop("You must run googledrive::drive_auth() first")
}

gs <- googlesheets::gs_key("............................................")

record_in_googlesheet <- function(id, title, score) {
  newline <- data.frame(timestamp = Sys.time(), id = id,
                        title = title, score = score)
  gs_add_row(gs, input = newline)
}
```
```r
#* Predicts ticket complexity given the initial issue.
#*
#* @param id Ticket id
#* @param title Ticket title
#* @param description Ticket description
#* @param org Organisation the ticket belongs to
#*
#* @post /
function(id, title, description, org) {
  score <- runif(1)
  record_in_googlesheet(id, title, score)
  score
}
```
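Before deploying something like this, you can exercise the handler logic locally with the sheet write stubbed out. The stub below and the function name score_ticket are hypothetical, just to keep the sketch self-contained:

```r
# Stand-in for record_in_googlesheet() so the handler can be tested
# without Google credentials: collect the rows in memory instead.
recorded <- list()
record_in_googlesheet <- function(id, title, score) {
  recorded[[length(recorded) + 1]] <<- list(id = id, title = title,
                                            score = score)
  invisible(NULL)
}

# Same body as the deployed plumber handler above.
score_ticket <- function(id, title, description, org) {
  score <- runif(1)   # placeholder "model", as in the example
  record_in_googlesheet(id, title, score)
  score
}

s <- score_ticket(42L, "Login fails", "User cannot log in", "acme")
```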
Of course, in your case you are writing logs in the futile.logger file format, so you still have to pick one of the destinations above.
Also, you will have to create a folder on the Connect server, e.g. /app-data, that you have access to.
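Once such a folder exists, pointing your logger at a file under it is all that is left. With futile.logger itself you would do that via flog.appender(appender.file(...)); here is a base-R stand-in of the same idea, with /app-data assumed and tempdir() as a fallback so the sketch runs anywhere:

```r
# Minimal stand-in for a file-based logger; futile.logger's
# appender.file() does the same job with more machinery.
# "/app-data/ticket-api" is the assumed shared folder on Connect.
log_dir <- if (dir.exists("/app-data")) "/app-data/ticket-api" else tempdir()
dir.create(log_dir, showWarnings = FALSE, recursive = TRUE)
log_file <- file.path(log_dir, "api.log")

log_info <- function(msg) {
  cat(sprintf("INFO [%s] %s\n", format(Sys.time()), msg),
      file = log_file, append = TRUE)
}

log_info("scored ticket 42")
```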
The support article Persistent Storage on RStudio Connect might be helpful, since it describes some of the ways of working with persistent storage in Connect.