Greetings! I'm trying to figure out whether the targets package is suited to our needs. I maintain a fairly large R package whose vignette takes thirty minutes to an hour to run for bigger analyses, and users regularly rerun these. Based on the package description, targets seems ideal. I've worked through the example and tried applying it, but I'm somewhat stuck on how it will scale within a large R package.
- Is there an example of using the targets package inside another package to speed up a single function? How would this work with multiple functions?
- Since I have many functions, and many of them will likely use some of the same object names, how will targets handle this? It seems all target objects are stored in the same folder, and it would be less than ideal for us to have to name every target object uniquely across functions.
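For concreteness, here's a sketch of the naming concern (the function and target names are made up for illustration, not from my actual package):

```r
# _targets.R -- illustrative sketch only
library(targets)

list(
  # Both analysis functions naturally produce an object I'd call "fit",
  # but target names appear to need to be unique across the whole
  # pipeline, so I end up prefixing everything:
  tar_target(analysis_a_fit, run_analysis_a(raw_data)),
  tar_target(analysis_b_fit, run_analysis_b(raw_data))
)
```
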
- It seems that to read the target objects you must call `tar_read()`. If an object is read using that function and then manipulated, do we have to write that object back for the targets pipeline to pick up the change?
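In other words, in a pattern like the following (`summarize_fit()` and the target name are hypothetical), I'm unsure what targets sees:

```r
library(targets)

fit <- tar_read(analysis_a_fit)      # read a stored target's value
stats <- summarize_fit(fit)          # hypothetical downstream manipulation
# Is this manipulation invisible to the pipeline unless `stats`
# becomes its own tar_target()?
```
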
- Say we have target objects that depend on the `data` object below:

  ```r
  file_path <- "folder/folder/file"
  data <- file_read(file_path)
  ```

  This code always gets run. Will the targets pipeline be able to recognize whether `data` is the same or has changed?
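I did notice `format = "file"` in the manual, so perhaps the snippet above is meant to be wired up something like this (`file_read()` and `analyze()` are our own functions; I'm guessing at the pattern):

```r
# _targets.R -- sketch of how I imagine tracking the file's contents
library(targets)

list(
  # The path itself never changes, only the file contents might;
  # format = "file" supposedly makes targets watch the file on disk.
  tar_target(data_file, "folder/folder/file", format = "file"),
  tar_target(data, file_read(data_file)),    # reruns when the file changes?
  tar_target(result, analyze(data))          # depends on data
)
```

Is that the intended approach, or is there a better way?
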
Thanks for any and all help on this; I'm excited about the possible benefits of the targets package! I've tried sifting through the documentation, so let me know if I missed something.