Hello. I am cautiously asking about this because I know my view is a bit controversial, as shown by the discussion here on Twitter.
In my view, a core element of that Twitter thread is where Hadley Wickham states "Surely you should always keep code and data together". That is logical if your "project" is a one-off research study where all your code and data logically fit together.
But what about a case where, for example, a stand-alone script reads data from a folder on a weekly basis, and the folder it reads from varies from week to week? The script may also source utility functions that live elsewhere on a shared drive, or in a folder that is not a sub-folder of the RStudio project.
In such a use case, is it really so bad to have a line or two at the top of the script where the person running it explicitly sets the location of the source data and/or any common scripts that need to be loaded before the main script runs?
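To make the pattern concrete, here is a minimal sketch of what I mean in R. All folder and file names below are invented for illustration; the point is that the only thing a user ever edits is the block at the top.

```r
# --- user-configurable locations (edit these each week) --------------
# Hypothetical paths, for illustration only.
data_dir  <- "N:/weekly_feeds/2024-W07"   # varies from week to week
utils_dir <- "N:/shared/r_utils"          # shared utilities, kept in one place

# --- load shared helpers (single copy, per DRY) ----------------------
source(file.path(utils_dir, "cleaning_functions.R"))
source(file.path(utils_dir, "reporting_functions.R"))

# --- read this week's data -------------------------------------------
weekly_data <- read.csv(file.path(data_dir, "extract.csv"))
```

Everything below the configuration block stays identical from week to week; only `data_dir` changes.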
In such a case, copying the shared utility scripts into the project folder each time would seem to violate the DRY ("don't repeat yourself") principle.
This use case is not really a standard research-project situation; it is more like an operational workflow.
Any views on this appreciated!