Hi,
I have the following problem. I have an R script that scrapes an HTML table from a webpage. The script is here:
library(rvest)  # provides read_html(), html_nodes(), html_table(), and %>%

url <- "webpage.com/task1"
table <- url %>%
  read_html() %>%  # html() is deprecated; read_html() is the current function
  html_nodes(xpath='//*[@id="Form1"]/table[4]') %>%
  html_table(fill = TRUE)
# html_table() returns a list of data frames; keep the first one
tableDF <- table[[1]]
But I need to scrape this HTML table from several different URLs of this webpage. I have listed all the pages in the following table (CSV format):
HTML URL
html_code_task1 webpage.com/task1
html_code_task2 webpage.com/task2
html_code_task3 webpage.com/task3
. .
. .
. .
My script works when I manually run it, for example on webpage.com/task10.
But I would like R to go through all the pages (from task1 to task50) that I have in my table, save the values, and finally merge them together. Is that possible, maybe using a for loop?
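A sketch of one way to do this, assuming your CSV is saved as "urls.csv" (a hypothetical filename) with a column named URL, and that every page contains the same table structure; this is untested against the real site:

```r
library(rvest)

# Read the table of URLs; assumes a column named "URL"
pages <- read.csv("urls.csv", stringsAsFactors = FALSE)

# Scrape one page and return the table as a data frame
scrape_one <- function(u) {
  u %>%
    read_html() %>%
    html_nodes(xpath = '//*[@id="Form1"]/table[4]') %>%
    html_table(fill = TRUE) %>%
    .[[1]]  # html_table() returns a list; keep the first table
}

# Apply the scraper to every URL; results is a list of data frames
results <- lapply(pages$URL, scrape_one)

# Stack all per-page data frames into one
merged <- do.call(rbind, results)
```

`lapply()` plays the role of the for loop here; `do.call(rbind, ...)` does the final merge, and requires that every page's table has the same columns. A plain `for` loop appending to a list would work just as well.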
Thank you very much for your help!