30 minutes to complete devtools::load_all()

Hello. I am currently developing an R Shiny application which I have put into a package. Development has ground to a halt because devtools::load_all() routinely takes over 30 minutes to run and reload my package. I suspect this is due to the amount of data I have included inside my package: I have added large chunks of data at several points in time, and each addition has made package development noticeably slower.

This data is used in various places in my Shiny application for analysis and visualization. I have 45 .rda objects stored in my data/ folder, bringing the folder size to ~700 MB. I used the usethis package to store and document this data.
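For context, this is roughly the workflow I mean, assuming a usethis-based setup (the object name `my_results` is just a placeholder):

```r
# Hypothetical example of how each dataset ended up in data/.
# usethis::use_data() serializes the object to data/my_results.rda.
my_results <- data.frame(id = 1:1e6, value = rnorm(1e6))
usethis::use_data(my_results, compress = "xz", overwrite = TRUE)
```

With 45 objects like this, devtools::load_all() has to lazy-load metadata for all of them on every reload.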

Has anyone else experienced significant slowdowns in package development when including more datasets? Is there a better way to handle this data?

One option I have considered is creating an external package whose sole purpose is to hold the data (R packages - internal and external data - coolbutuseless). My thinking is that this would speed up development of my Shiny package since the data objects would no longer be included in it. Since I would not update the new "data only" package often, a 30 minute devtools::load_all() call there would not be a big deal.

I think moving your data to another R package is a good idea. However, it sounds like your data is very large if it takes 30 minutes to load, and there's some unnecessary overhead if a package's sole purpose is to move data around.

Consider hosting the data in an S3 bucket or somewhere else you can easily download from. It could save you a lot of development time.
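A minimal sketch of what that could look like, assuming the data is saved as .rds files at some URL (the URL, package name, and object name below are all placeholders):

```r
# Sketch: fetch a dataset from remote storage on first use instead of
# bundling it in the package, then cache it locally.
data_url  <- "https://my-bucket.s3.amazonaws.com/my_results.rds"
local_rds <- file.path(tools::R_user_dir("mypkg", "cache"), "my_results.rds")

dir.create(dirname(local_rds), recursive = TRUE, showWarnings = FALSE)
if (!file.exists(local_rds)) {
  # Only hit the network once; subsequent loads read from the cache.
  download.file(data_url, local_rds, mode = "wb")
}
my_results <- readRDS(local_rds)
```

This keeps data/ out of the package entirely, so load_all() has nothing heavy to touch.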

For development purposes, consider that you may be able to subset your data into a form small enough that you can still run and test your Shiny app. You can then keep that small subset in your R package. This is my current approach.
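Subsetting can be as simple as sampling a few rows from each object and re-saving those. A sketch, again with placeholder names and an arbitrary 1,000-row cutoff:

```r
# Sketch: keep only a small random sample of a large dataset for development.
set.seed(42)  # reproducible sample
my_results_dev <- my_results[sample(nrow(my_results), 1000), ]

# Store the small version as package data in place of the full object.
usethis::use_data(my_results_dev, compress = "xz", overwrite = TRUE)
```

The full datasets then live outside the package (or in the data-only package), while the package itself stays fast to reload.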

Both are good ideas! Thank you. I think the best option for me moving forward is hosting the data somewhere else.