Are there good resources out there about how workflow and analysis evolve as they go from exploratory into a more polished product such as an “analysis package”?
It feels to me like there are significant differences in the workflow of an analysis from when it is a single script -> multiple scripts -> scripts + functions -> scripts + functions + tests + documentation, etc…
I have a decent understanding of how the beginning stages work, and from reading the ‘R Packages’ book I am gaining an understanding of how a package works. I have also seen some scattered wisdom and comments out there about how to fit a large data analysis into a package structure. Where I am having trouble is the morphing process: an exploratory data analysis gradually grows in size, and I want to add package features such as function documentation and tests to make sure I am not making unintentional changes to the analysis as I refactor the functions that perform parts of it.
Just to give one example, package dependencies:
During the multi-script phase, I tend to have a script called `setup.R` which I can call at the beginning of any given sub-analysis to make sure I have the same set of packages loaded everywhere. I also use this to source function files.
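To make that concrete, a minimal sketch of what such a `setup.R` might look like (the package names and the `R/` directory path here are illustrative, not prescriptive):

```r
# setup.R -- shared setup, sourced at the top of every sub-analysis script.
# Package names and paths below are illustrative.

# Load the packages used across the whole analysis
library(dplyr)
library(ggplot2)

# Source every helper-function file kept in a common directory
for (f in list.files("R", pattern = "\\.R$", full.names = TRUE)) {
  source(f)
}
```

Each analysis script then just starts with `source("setup.R")`, so every sub-analysis sees the same packages and helper functions.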
[ … mysterious middle … ]
For the fully-formed analysis package, the `setup.R` script doesn’t work. Instead, each function should have roxygen-declared `@importFrom` tags or fully specified `package::fun`-style usage inside the function bodies.
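For illustration, here is a hedged sketch of what that looks like on a single helper function (the function itself is hypothetical; it just shows both the `@importFrom` tag and the fully qualified `package::fun` call style side by side):

```r
#' Summarise a measurement column by group
#'
#' (Hypothetical helper, shown only to illustrate roxygen import styles.)
#'
#' @param data A data frame with `group` and `value` columns.
#' @importFrom dplyr summarise
#' @export
summarise_by_group <- function(data) {
  data |>
    dplyr::group_by(group) |>               # fully qualified call style
    summarise(mean_value = mean(value))     # relies on the @importFrom tag
}
```

Running `devtools::document()` then writes the corresponding `importFrom(dplyr, summarise)` directive into `NAMESPACE`, so the function works without any `library()` calls at run time.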
But in the meantime, I just want to run some unit tests on my analysis functions – I don’t want to make the dependencies perfectly minimal the way you would for a package you expect others to use. For now I have added `@import` tags for all the packages I used to load in `setup.R`, but R/devtools is constantly yelling at me, and tests fail because of package-dependency issues that aren’t really related to my analysis.
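For context, the kind of unit test I have in mind is a simple regression-style check with `testthat` – freeze the current output of a helper so refactoring can’t silently change the numbers (the function name here is hypothetical):

```r
# tests/testthat/test-summaries.R
# Regression-style test: confirm that refactoring a helper does not
# change the values it produces. summarise_by_group() is a hypothetical
# analysis function used for illustration.
library(testthat)

test_that("group summaries are unchanged by refactoring", {
  data <- data.frame(group = c("a", "a", "b"), value = c(1, 3, 10))
  result <- summarise_by_group(data)
  expect_equal(result$mean_value, c(2, 10))
})
```

These tests run with `devtools::test()`, which is exactly where the dependency complaints surface even though the tests themselves only exercise my own functions.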
Anyway, point is: are there good resources out there about how workflow changes as the analysis grows and gains package features?