Hello,
I'd like to add that if R can barely handle writing the file, Excel will hang trying to load it, and manipulating such a large data set in Excel will be nearly impossible.
Why do you want to do anything with it in Excel? R is much better at handling data manipulations at this scale...
If you want to save the data without needing Excel, I suggest the RDS format, which stores the data in a compressed binary form. Excel can't read it, but R can, and it saves a lot of disk space. See the example below:
# Generate a large example dataset
n <- 5000000
myData <- data.frame(x = sample(LETTERS, n, replace = TRUE),
                     y = sample(1:1000, n, replace = TRUE),
                     z = runif(n))

# Save as csv
# -----------
write.csv(myData, "testFile.csv")  # About 175 MB

# Save as rds
# -----------
saveRDS(myData, "testFile.rds")  # About 40 MB

# Read rds
# -----------
loadData <- readRDS("testFile.rds")
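A small add-on, in case file size matters more than write speed: saveRDS() compresses with gzip by default, and its compress argument accepts "xz" for stronger (but slower) compression. A round trip through readRDS() restores an identical object either way. Here is a self-contained sketch using a smaller data frame (the file name and object names are just for illustration):

```r
# Small example data frame so this runs quickly
n <- 1000
smallData <- data.frame(x = sample(LETTERS, n, replace = TRUE),
                        z = runif(n))

# compress = "xz" trades write time for a smaller file on disk
saveRDS(smallData, "smallFile.rds", compress = "xz")

# The round trip restores an identical object
restored <- readRDS("smallFile.rds")
identical(smallData, restored)  # TRUE
```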
Hope this helps,
PJ