Downloading multiple files from a Google Drive sub-directory and saving to a specific local directory

I am looking to 'batch' download files from a particular Google Drive sub-directory. jennybryan has a great reprex and explanation, which downloads all files to the current working directory. I want to sequester the downloads in a temporary directory that is recycled at session's end.

From jennybryan:


library(googledrive)
library(purrr)

## store the URL you have
folder_url <- ""

## identify this folder on Drive
## let googledrive know this is a file ID or URL, as opposed to file name
folder <- drive_get(as_id(folder_url))

## identify the csv files in that folder
csv_files <- drive_ls(folder, type = "csv")

## download them
walk(csv_files$id, ~ drive_download(as_id(.x)))

When I try to change the target directory, I get an error:

## create the temporary directory
csv_dir <- tempdir()

## download the files
walk(csv_files$id, ~ drive_download(as_id(.x), path = csv_dir, overwrite = TRUE))

## error message
Error in curl::curl_fetch_disk(url, x$path, handle = handle) : 
  Failed to open file 

I have devised a work-around: setwd(csv_dir), run the walk() call, then setwd(old). I am curious why it fails when I specify the target directory in the call to walk(), since I can specify the target directory when calling drive_download() on a single file and it executes successfully.
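For what it's worth, the setwd() work-around described above can be wrapped so the working directory is always restored, even if a download errors partway through. This is only a sketch (download_to is a hypothetical helper name, and the code needs Drive authentication to actually run):

```r
library(googledrive)
library(purrr)

## wrap the setwd() work-around in a function so on.exit()
## restores the original working directory, error or not
download_to <- function(dir, ids) {
  old <- setwd(dir)
  on.exit(setwd(old), add = TRUE)  # runs even if a download fails
  walk(ids, ~ drive_download(as_id(.x), overwrite = TRUE))
}

download_to(tempdir(), csv_files$id)
```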


I haven't run any code, but I suspect you need to build a vector of full paths for the output files, e.g. TEMPDIR/filename, then use walk2(csv_files$id, my_new_files, ~ drive_download(as_id(.x), path = .y, overwrite = TRUE)) (treat that as pseudo-code). The path argument of drive_download() is a file path, not a directory, which is why passing csv_dir alone fails.
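Spelled out a bit more, the suggestion above might look like this. It's untested (it needs Drive authentication and the csv_files object from the original listing code), so treat it as a sketch:

```r
library(googledrive)
library(purrr)

## build one full destination path per file inside the temp
## directory, pairing each Drive file id with a local file path
dest_paths <- file.path(tempdir(), csv_files$name)

## walk2() iterates over ids and paths in parallel, so each
## drive_download() call gets a complete file path, not a directory
walk2(
  csv_files$id, dest_paths,
  ~ drive_download(as_id(.x), path = .y, overwrite = TRUE)
)
```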

I see what you're hoping for, and it makes sense; you're just bumping into an infelicity of drive_download(). It's somewhat related to another problem I've apparently experienced myself:
