Here's the situation:
For whatever reason, the source data lives in data frames in memory, not in a database. The goal is to run a large set of existing SQL statements against these data frames. I'm looking for general advice because I don't think I have a complete picture of the possible solutions to this problem.
The solutions I thought of:
- Use the sqldf package to query the data frames directly.
- Load the data frames into a SQLite database and run the SQL statements against it via DBI functions.
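For concreteness, here is a minimal sketch of both approaches. The `sales` data frame and the query are made up for illustration; substitute your own data frames and SQL. Note that sqldf itself uses SQLite by default behind the scenes, copying the data into a temporary database, running the query, and returning a data frame.

```r
library(sqldf)
library(DBI)

# Illustrative data frame standing in for the real source data.
sales <- data.frame(region = c("east", "west"), amount = c(100, 250))

# Approach 1: sqldf queries data frames by their R names directly.
res1 <- sqldf("SELECT region, amount FROM sales WHERE amount > 150")

# Approach 2: an explicit SQLite database via DBI/RSQLite.
con <- dbConnect(RSQLite::SQLite(), ":memory:")  # or a file path to persist
dbWriteTable(con, "sales", sales)
res2 <- dbGetQuery(con, "SELECT region, amount FROM sales WHERE amount > 150")
dbDisconnect(con)
```

Both calls return ordinary data frames, so the rest of the R code downstream doesn't need to care which approach produced the result.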
Is there a strongly preferred choice here? Any real pluses or minuses? Is there an even better solution than either of those two?
Right now I'm leaning towards #2, because after the conversion I could also use SQL chunks in R Markdown documents. I'm asking for general advice because this is a problem I've never run into personally, but I imagine it has come up many times before.
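To illustrate the R Markdown angle: once a DBI connection exists, knitr's SQL engine can run chunks against it via the `connection` chunk option. A sketch, assuming a connection `con` and a hypothetical `sales` table set up in an earlier R chunk:

````markdown
```{r setup-db}
library(DBI)
con <- dbConnect(RSQLite::SQLite(), ":memory:")
dbWriteTable(con, "sales", sales)  # `sales` is an existing data frame
```

```{sql, connection=con}
SELECT region, SUM(amount) AS total
FROM sales
GROUP BY region
```
````

By default the SQL chunk prints its result; adding `output.var = "some_name"` to the chunk options instead stores the result as a data frame for later R chunks.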