Access in-memory R object created in other session

I asked a similar question here and received a much-appreciated but ultimately discouraging answer, so I thought I'd rephrase a bit in the hope of getting help.

The situation: I have an application that reads a large quantity of data at high frequency
over a network connection, gathers it into batches in memory, and then writes
those batches to a MySQL db. The app is meant to be up and running most of the time,
gathering away, and is therefore started on a schedule via an Rscript call from the shell.

The problem: I'd like to see the most recent batches of data without doing an expensive read
from the database and/or hitting the disk.
After all, the data are already somewhere in memory, and, for reasons of speed, I'd ideally
like to be able to, for example:

  • read that data into a Shiny dashboard on an ad hoc basis, or
  • fire up a fresh RStudio session and read the current in-memory batch into a fresh R object for some ad hoc analysis

Two ideas that I've sort of played around with, unsuccessfully so far:

1- have the app create a fresh environment when it is first run and save the name of this
environment somewhere on disk, which could then be used as a sort of address by a new
RStudio/Shiny/etc. session (the bigmemory sketch further down is the closest I've come to making this concrete)

2- have some in-memory db to which the data batches are written and from which other local apps could read, e.g. (see also the tmpfs variant sketched just below):

# NB: a ":memory:" SQLite database is private to the connection (and hence the
# process) that creates it, so a second R session can never see this table
con <- DBI::dbConnect(RSQLite::SQLite(), ":memory:")
DBI::dbWriteTable(con, "batch_table", my_data_batch)
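
One variant of idea 2 that I've been sketching but haven't fully tested: since a ":memory:" database can't be shared, point SQLite at a file on a RAM-backed tmpfs instead, e.g. /dev/shm on Linux. Other local processes can open the same path, and nothing hits actual disk. The path and table name below are just placeholders:

library(DBI)

# gathering app writes each batch to the RAM-backed file
con_w <- DBI::dbConnect(RSQLite::SQLite(), "/dev/shm/batches.sqlite")
DBI::dbWriteTable(con_w, "batch_table", my_data_batch, overwrite = TRUE)

# any other local session (RStudio, Shiny, ...) opens the same path and reads it back
con_r <- DBI::dbConnect(RSQLite::SQLite(), "/dev/shm/batches.sqlite")
latest <- DBI::dbReadTable(con_r, "batch_table")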

But, as I say, I can't get either of these approaches to work in their plain form.
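
A third direction I've been eyeing, which is roughly idea 1 done properly: the bigmemory package keeps a numeric matrix in shared memory and can write a small descriptor to disk, which is exactly the sort of on-disk "address" I was imagining. A minimal sketch, assuming the batch can be coerced to a numeric matrix (the paths and object names here are made up):

library(bigmemory)

# gathering app: put the batch in shared memory and publish its descriptor
batch_mat <- as.big.matrix(as.matrix(my_data_batch), shared = TRUE)
saveRDS(describe(batch_mat), "/tmp/batch.desc")

# fresh RStudio session: attach to the same memory via the descriptor
desc <- readRDS("/tmp/batch.desc")
batch_view <- attach.big.matrix(desc)
head(batch_view[, ])

(My understanding is that the shared segment only lives as long as some process holds it, but since the gathering app stays up anyway, that seems fine here.)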

I'm open to suggestions and/or criticism ("your whole idea is bad and here's why").
Thanks in advance.
