My files are stored in an S3 bucket, and I'm trying to read file info such as:
- last updated time
- file size
but I couldn't, because I'm not sure what path/directory to give list.files().
library(aws.s3)
library(dplyr)

file_list <- aws.s3::get_bucket(
  # complete bucket link, including the folder name where the files live
)

# Turn the listing into a data frame and extract the Key column
file_names <- file_list %>% as.data.frame() %>% select(Key)
file_names # prints all the file names stored in the bucket

lst_fl <- list.files( # stuck here: what path/directory should I give?
  recursive = FALSE,
  pattern = "csv$",   # keep only files ending in .csv
  full.names = TRUE)
Your question is quite convoluted and I don't really understand what the exact issue is. Can you rephrase it and create a reprex so we can test it and see where it goes wrong?
I have not used S3 buckets myself, but it seems this is more an issue with the specifics of that package than something purely R, right? Anyway, I found this post that might help, especially if you only need the metadata and not the files themselves.
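From a quick look at the aws.s3 docs, the bucket listing itself seems to already carry the metadata you're after, so something like this untested sketch might be all you need ("my-bucket" and "my-folder/" are placeholders for your own bucket and prefix):

library(aws.s3)

# get_bucket_df() returns the listing as a data frame,
# one row per object, with Key, LastModified, Size, etc.
file_info <- get_bucket_df("my-bucket", prefix = "my-folder/")
file_info[, c("Key", "LastModified", "Size")]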
Thanks ... it seems list.files() simply can't be used with an S3 bucket.
"@srane163 You can't use list.files() to see objects in S3 - that's only for local files. Your files object is a list of object keys. You can just lapply(files, get_object) to get a list of the objects (as raw vectors), which you can then parse using whatever you want."