How to use a 'for' loop with simple syntax

I coded like this:

rate_1 = aggregated_rate %>% select(ID, LENGTH, rate1)
write.csv(rate_1, "~")
rate_2 = aggregated_rate %>% select(ID, LENGTH, rate2)
write.csv(rate_2, "~")
...
rate_9 = aggregated_rate %>% select(ID, LENGTH, rate9)
write.csv(rate_9, "~")

I want to convert this long code into just two lines with a 'for' loop. I guess this would be very basic for programmers... but I haven't studied it yet, so I would appreciate it if you could help me :slight_smile:
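The repeated select-and-write blocks above can be collapsed with a base-R for loop. A minimal sketch, assuming aggregated_rate has columns ID, LENGTH, and rate1 through rate9 (the toy data frame and the tempdir() output path here are stand-ins for illustration):

```r
# Toy data frame standing in for the real aggregated_rate
aggregated_rate <- data.frame(ID = 1:3, LENGTH = c(10, 20, 30))
for (i in 1:9) aggregated_rate[[paste0("rate", i)]] <- runif(3)

out_dir <- tempdir()  # stand-in for the real output directory

# One iteration per rate column: subset, then write a CSV per column
for (i in 1:9) {
  rate_i <- aggregated_rate[, c("ID", "LENGTH", paste0("rate", i))]
  write.csv(rate_i, file.path(out_dir, paste0("rate_", i, ".csv")),
            row.names = FALSE)
}
```

The loop index i is used twice: once to pick the column name ("rate1", ...) and once to build the output filename ("rate_1.csv", ...).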

I don't think "~" is a valid filename to write to, so I am doubting your starting code...
That aside, I would prefer to structure the code like this (assuming library(tidyverse)):

walk(paste0("rate", c(1:2, 9)),
     ~ {aggregated_rate %>%
          select(ID, LENGTH, all_of(.x)) %>%
          write.csv("~")})

Thank you, but sorry, this code doesn't work. When I run it, it returns this error:

Error in if (file == "") file <- stdout() else if (is.character(file)) { : the condition has length > 1
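For context on this message: it comes from write.csv()/write.table(), which checks the file argument with `if (file == "")`; if file receives a character vector of length greater than one, that condition has length > 1, which is an error on R >= 4.2. A minimal reproduction with hypothetical filenames:

```r
df <- data.frame(ID = 1, LENGTH = 2)

# Passing two filenames at once trips the internal `if (file == "")` check
msg <- tryCatch(
  write.csv(df, file.path(tempdir(), c("a.csv", "b.csv"))),
  error = function(e) conditionMessage(e)
)
print(msg)
```

On R versions before 4.2 the same call raises a warning instead of an error, so the exact behavior depends on the R version.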

With this issue, I would also like to confirm a few minor things.
I apologize for not having written down the specific conditions.

First of all, I want to perform the task for 1-9, that is, nine files, not just for 1, 2, and 9 (which is what "c(1:2,9)" in your code seems to do). In that case, shouldn't I use c(1:9) instead?

Furthermore, I want to save each output under the name "rate_", in the directory "C:/Users/Documents".
For example, if there were no loop and I wanted the output "rate_1", I would type this:

write.csv(rate_1, "C:/Users/Documents/rate_1")

But I want to do this task for all nine files.

So... I would very much appreciate it if you could show me how to deal with these issues... :slightly_frowning_face:

walk(1:9,
     ~ {aggregated_rate %>%
          select(ID, LENGTH, all_of(paste0("rate", .x))) %>%
          write.csv(paste0("rate_", .x))})

It worked. Thank you!
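The accepted call above writes files named "rate_1", "rate_2", ... (no extension) into the current working directory. To also meet the directory and filename requirement mentioned earlier, the same walk() can take a full path and a ".csv" extension. A sketch, assuming the tidyverse packages are installed; the toy data frame and tempdir() stand in for the real aggregated_rate and "C:/Users/Documents":

```r
library(purrr)
library(dplyr)

# Toy stand-in for the real aggregated_rate
aggregated_rate <- data.frame(ID = 1:3, LENGTH = c(10, 20, 30))
for (i in 1:9) aggregated_rate[[paste0("rate", i)]] <- runif(3)

out_dir <- tempdir()  # replace with "C:/Users/Documents" on the real machine

# Same pattern as the accepted answer, plus directory and .csv extension
walk(1:9,
     ~ aggregated_rate %>%
         select(ID, LENGTH, all_of(paste0("rate", .x))) %>%
         write.csv(file.path(out_dir, paste0("rate_", .x, ".csv")),
                   row.names = FALSE))
```

file.path() joins the directory and filename portably, and row.names = FALSE keeps the spurious row-number column out of the CSVs.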
