I'm trying to fetch a large set of query results from a database, but I don't need to hold them all in memory at once. I was hoping it would be possible to stream them (unsure if that's the correct terminology) in chunks of, say, 1,000 rows and process them in R as they arrive, keeping only the latest chunk in memory.
Has anyone tried this with success before? Is it possible to do this in tandem with R Markdown's SQL code chunks (i.e., have the chunk return a lazy iterable rather than a fully materialised data frame)?
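For context, here's the kind of thing I have in mind using plain DBI (sketched with an in-memory SQLite connection; the table name `big_table` and the `process_chunk` helper are placeholders for my actual query and processing step):

```r
library(DBI)

con <- dbConnect(RSQLite::SQLite(), ":memory:")

# Placeholder for whatever per-chunk processing I'd actually do
process_chunk <- function(df) {
  message("processing ", nrow(df), " rows")
}

# Send the query without fetching anything yet,
# then pull rows in batches of 1,000
res <- dbSendQuery(con, "SELECT * FROM big_table")
while (!dbHasCompleted(res)) {
  chunk <- dbFetch(res, n = 1000)  # only this chunk is in memory
  process_chunk(chunk)
}
dbClearResult(res)
dbDisconnect(con)
```

This works at the console, but I don't see how to express it through an Rmarkdown ```` ```{sql} ```` chunk, which (as far as I can tell) always materialises the full result into the data frame named by `output.var`.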
Thank you for any help.