Nothing really sticks out based on what you are doing.
I think many of us are doing similar things, even with bigger file sizes.
Some more suggestions off the top of my head, from easiest to hardest:
- You can try suppressing the warnings with `options(warn = -1)` and see if it makes any difference (documentation here).
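A minimal sketch of that, saving the old setting so you can restore it afterwards (`suppressWarnings()` is an alternative that scopes the change to a single call):

```r
# Silence warnings for the whole run, then restore the previous level.
old <- options(warn = -1)   # -1 = ignore warnings entirely
# ... run your script here ...
options(old)                # put the old warning level back
```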
- I would try the suggestion in the 2nd link and source the code into the console instead of using "Run All." I don't know what "Run All" actually does behind the scenes ... I never use it.
- See if there are differences in the read/write actions between the two environments. Try a couple of bigger files and see if those are somehow taking more time in RStudio.
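A quick way to compare, assuming one of your larger files is a CSV (the filename here is a placeholder):

```r
# Time the same read and write in plain R and in RStudio, then compare.
t_read  <- system.time(df <- read.csv("big_file.csv"))      # hypothetical file
t_write <- system.time(write.csv(df, tempfile(fileext = ".csv")))
print(t_read)   # compare the "elapsed" value between the two environments
print(t_write)
```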
- Track memory usage and see the difference in memory utilization. Maybe RStudio is capping memory more aggressively.
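Base R's `gc()` is enough for a rough peak-memory comparison, no extra packages needed:

```r
gc(reset = TRUE)        # reset the "max used" counters
# ... run the slow chunk of your script here ...
gc()                    # the "max used" column shows peak memory since the reset
```

Run the same chunk in R and in RStudio and compare the "max used" figures.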
The best way to find the answer is to profile the code and see what the delta is between running in R vs. RStudio.
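A sketch with base R's built-in profiler (the script name is a placeholder for yours):

```r
# Profile the script in each environment, then diff the summaries.
Rprof("profile.out", memory.profiling = TRUE)
source("your_script.R")                         # hypothetical script name
Rprof(NULL)                                     # stop profiling
summaryRprof("profile.out")                     # time spent per function
summaryRprof("profile.out", memory = "both")    # same, with memory stats
```

Whichever functions dominate `$by.self` in RStudio but not in plain R are your delta.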
If you plan to continue doing more work on large data sets and performance is an issue, I would recommend looking into the data.table package. tidyverse is expressive but, like base R, not great with memory use or run time. I've gotten pretty good performance jumps with data.table.
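To give a flavor of the kind of change involved (column names here are made up; assumes data.table is installed):

```r
library(data.table)

dt <- fread("big_file.csv")                         # typically much faster than read.csv
result <- dt[, .(total = sum(value)), by = group]   # grouped summary, low memory overhead
fwrite(result, "summary.csv")                       # fast CSV writer
```

`fread()`/`fwrite()` alone can cut I/O time substantially, which matters if your script is read/write heavy.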
Lastly, is there a need to run the code in RStudio once it's been developed?
I wouldn't run production code from any IDE.