Hi,
I have the following code, which reads in data from an Excel file named "sitn". It then merges and filters this with a summarised table named "all_sum" and performs some basic calculations before exporting the result.
I would like to adapt this code so that it reads in tables (each with identical formatting) from every worksheet in "sitn", performs the same calculations and formatting, and then exports a separate file for each worksheet (there is a rough sketch of what I mean after the code below).
Basically, it would do the same as it does now but save a lot of time, as I wouldn't have to rerun the process for each worksheet.
library(dplyr)
# read in the ITN data (sitn holds the file path)
itn <- read.csv(sitn, stringsAsFactors = FALSE)
# first, convert relevant character columns to numeric
# (jtdata is already loaded in the workspace)
jtdata$N <- as.numeric(jtdata$N)
jtdata$Av_jt <- as.numeric(jtdata$Av_jt)
# then calculate the average for each link and time period
all_sum <- jtdata %>%
  group_by(Select_t, link_id) %>%
  summarise(AVJT_all = weighted.mean(Av_jt, N) / 100,
            obs_all = sum(N))
# merge in the ITN data
all_sum <- merge(all_sum, itn[, c("Link_id", "Classifica", "Lnklength")],
                 by.x = "link_id", by.y = "Link_id", all.x = TRUE) %>%
  filter(Classifica != "N/A")
# add in speed (km/h, assuming Lnklength is in metres and AVJT_all in seconds)
all_sum <- mutate(all_sum,
                  speed = (Lnklength / 1000) / (AVJT_all / 3600))
write.csv(all_sum, "JT_Averages_by_link.csv", row.names = F, quote = F)
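To make the question concrete, here is a rough sketch of the sort of loop I have in mind (not working code). I'm assuming each worksheet of the "sitn" workbook holds an ITN-style table (Link_id, Classifica, Lnklength) and that all_sum is the summarised jtdata from the group_by/summarise step above, before the merge; the readxl package, the sitn_path variable and the output file naming are just guesses on my part:
library(readxl)
library(dplyr)

sitn_path <- "sitn.xlsx"   # placeholder path to the multi-sheet workbook

for (sheet in excel_sheets(sitn_path)) {
  # read the ITN-style table from this worksheet
  itn_sheet <- read_excel(sitn_path, sheet = sheet)
  # same merge / filter / speed calculation as above, kept in a separate object
  # so all_sum is not overwritten between iterations
  out <- merge(all_sum, itn_sheet[, c("Link_id", "Classifica", "Lnklength")],
               by.x = "link_id", by.y = "Link_id", all.x = TRUE) %>%
    filter(Classifica != "N/A") %>%
    mutate(speed = (Lnklength / 1000) / (AVJT_all / 3600))
  # one output file per worksheet, named after the sheet
  write.csv(out, paste0("JT_Averages_by_link_", sheet, ".csv"),
            row.names = FALSE, quote = FALSE)
}
If it is actually the journey-time data that differs between worksheets rather than the ITN table, I imagine the same loop structure would still work, with the numeric conversion and group_by/summarise steps moved inside the loop instead.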
Thanks.