Large temporary files persist after being read in with read_csv

I'm reading in a number of large (compressed) csv files using read_csv. The files are successfully being pulled into R, but leave behind very large files in my %tmp% directory. Is there a way to clean these up automatically from within R?
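To make it concrete, each import looks roughly like this (the file name is just a placeholder), and afterwards a large decompressed copy of the data is sitting in the session temp directory:

library(readr)

# Placeholder file name for illustration
df <- read_csv("exports/transactions.csv.gz")

# A large decompressed copy now shows up under the session temp directory
tmp_files <- list.files(tempdir(), full.names = TRUE)
file.size(tmp_files)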

Thanks for your help and the free software!

If this is the /tmp directory on macOS or Linux, those files are automatically purged on reboot.

Sorry, it is on Windows 10. It is especially problematic on a shared Cloud PC.

I stopped using Windows around Vista, so I won't speculate. This post addresses a similar issue.

Not sure about this, but would

file.remove(list.files(tempdir(), full.names = TRUE))

after each read do the trick?
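Or, to avoid touching anything the read didn't create, a small wrapper (untested sketch) could snapshot the temp directory before each read and delete only whatever appeared during it:

read_csv_clean <- function(path, ...) {
  before <- list.files(tempdir(), full.names = TRUE)
  # lazy = FALSE (readr >= 2.0) reads eagerly, so nothing should keep
  # the decompressed temp copy open after the call returns
  out <- readr::read_csv(path, lazy = FALSE, ...)
  leftovers <- setdiff(list.files(tempdir(), full.names = TRUE), before)
  unlink(leftovers, recursive = TRUE)
  out
}

# e.g. df <- read_csv_clean("my_big_file.csv.gz")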


Thank you. I'll look into using vroom directly.
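In the meantime I'll also test whether turning off lazy reading makes a difference. As I read the docs, read_csv() gained a lazy argument in readr 2.x and vroom() has an altrep argument that forces full materialisation; both are my reading of the documentation rather than something I've verified here, and the file name below is just a placeholder:

library(readr)
library(vroom)

path <- "exports/transactions.csv.gz"

# Read eagerly so no lazy object keeps the decompressed temp copy in use
df1 <- read_csv(path, lazy = FALSE)

# Rough vroom equivalent; altrep = FALSE should materialise everything up front
df2 <- vroom(path, delim = ",", altrep = FALSE)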

I've run into this issue when running multiple imports, each of which uses lapply inside a function to read and filter several files. I wonder if I've prevented R from correctly destroying the objects.
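One guess: on Windows a file that is still open can't be deleted, so if anything is still referencing the lazily-read data when the cleanup runs, the temp copies it points at will survive. Reading eagerly, then forcing a garbage collection before deleting, might help; a rough sketch with made-up folder and filtering logic:

paths <- list.files("data", pattern = "\\.csv\\.gz$", full.names = TRUE)

results <- lapply(paths, function(p) {
  df <- readr::read_csv(p, lazy = FALSE)
  df[!is.na(df[[1]]), ]   # stand-in for the real filtering step
})

# Collect, in case something is still holding the decompressed copies open
invisible(gc())

# Only then try to clear the session temp directory
unlink(list.files(tempdir(), full.names = TRUE), recursive = TRUE)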

@startz That might work if I set a different temp dir for R to use; I don't want to purge all of %tmp% and screw up other users.
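Reading ?tempdir again, it looks like each R session already gets its own RtmpXXXXXX folder under %TEMP%, so clearing tempdir() should only touch this session's files. If I want to keep R's clutter out of the shared %TEMP% entirely, I could point the session temp directory at a personal folder before startup (the paths below are just examples):

# In .Renviron, so it takes effect at the next R restart (example path only):
# TMPDIR=C:/Users/myname/r-temp

# Inside R, confirm where this session's temp directory actually lives
tempdir()
# something like "C:\\Users\\myname\\AppData\\Local\\Temp\\RtmpAbC123"

# Once the imports are finished, clear only this session's leftovers
unlink(list.files(tempdir(), full.names = TRUE), recursive = TRUE)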
