How to read file info from an AWS S3 bucket?

Hello Experts,

Files are stored in an S3 bucket, and I tried to read file info such as:
Last updated time
File size, etc.
I couldn't, because I'm not sure what directory to pass to list.files().

library(aws.s3)
library(dplyr) # for %>% and select()

file_list <- aws.s3::get_bucket(# complete bucket link including the folder name of the files in place)

# Turn the listing into a data frame and extract the Key column
file_names <- file_list %>% as.data.frame() %>% select(Key)
file_names # In the console I can see all the file names stored in the bucket

lst_fl <- list.files( # Stuck here: what path/directory to give?
                      recursive = FALSE,
                      pattern = "csv$", # keep only files ending in csv
                      full.names = TRUE)

Hi,

Your question is very convoluted, and I don't really understand what the exact issue is. Can you rephrase the issue and create a reprex so we can test it and see where it goes wrong?

Good luck

Thanks for the reply.
The key issue is this: how can we give the directory name to list.files() when the files are stored in an S3 bucket?

Also, I've updated the question.

Hi,

I have not used S3 buckets myself, but it seems this is more an issue with the specifics of that package than something purely R-related, right? Anyway, I found this post that might help, especially if you only need the metadata and not the file itself.
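For what it's worth, a sketch along those lines (untested by me; the bucket name and prefix below are placeholders): aws.s3 also has a get_bucket_df() helper that returns the listing as a data frame, including the last-modified time and size you were after.

library(aws.s3)
library(dplyr)

# get_bucket_df() returns one row per object with its metadata
info <- get_bucket_df(bucket = "my-bucket", prefix = "my-folder/") # placeholders

# Last updated time and file size come straight from the listing
info %>% select(Key, LastModified, Size)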

Greetings


Thanks ... it seems it is impossible to use list.files() when dealing with an S3 bucket.

"@srane163 You can't use list.files() to see objects in S3 - that's only for local files. Your files object is a list of object keys. You can just lapply(files, get_object) to get a list of the objects (as raw vectors), which you can then parse using whatever you want."
