I've been working on two different projects recently that each have multiple files I'd like to import at once, so I'm trying to get a workflow down that makes importing and managing multiple data frames easier. I can currently import a batch of files with the following set of commands, but it's a little unwieldy, particularly the last part.
```r
# create a string of the desired file names and locations
files <- paste0(here::here("Data", "Cognition data"), "/",
                list.files(path = here::here("Data", "Cognition data"),
                           pattern = "\\.sav$"))

# read in all data files simultaneously
cog_data <- files |> map(haven::read_sav)

# find the file names; apply said names to the list levels
names(cog_data) <- here::here("Data", "Cognition data") |>  # specify the file path as a string
  list.files(pattern = "\\.sav$") |>  # search this location for files with this extension
  gsub(pattern = "\\.sav$", replacement = "")  # remove the extension to keep only the name

# extract data frames from the list into the global environment
list2env(cog_data, globalenv())
```
My question is whether there's a better way to read the files in so that their names are retained. My method reads them into a list with unnamed elements, so I have to scrape the file names separately and stitch them back in. Would love to hear suggestions on how to do this better!
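For what it's worth, here's the kind of thing I've been sketching: naming the vector of paths with `purrr::set_names()` *before* mapping, so `map()` carries the names through and the list comes out named from the start. I'm not sure it's the idiomatic way, and the directory and file names below are made up for illustration (I write two throwaway `.sav` files to a temp dir so the snippet runs on its own):

```r
library(haven)  # read_sav(), write_sav()
library(purrr)  # map(), set_names()

# stand-in for here::here("Data", "Cognition data"); file names are invented
dir <- tempdir()
write_sav(data.frame(score = 1:3), file.path(dir, "stroop.sav"))
write_sav(data.frame(rt = 4:6), file.path(dir, "nback.sav"))

# full.names = TRUE returns complete paths in one call (no paste0() needed);
# "\\.sav$" anchors the match so e.g. "notes.sav.bak" is skipped
files <- list.files(dir, pattern = "\\.sav$", full.names = TRUE)

# name the vector first, then map() -- the result is a named list
cog_data <- files |>
  set_names(\(f) tools::file_path_sans_ext(basename(f))) |>
  map(read_sav)

# extract data frames from the named list into the global environment
list2env(cog_data, globalenv())
```

`set_names()` accepts a function, which is applied to the vector to produce the names; `tools::file_path_sans_ext(basename(f))` strips the directory and the extension in one go.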