Hey @kaushiklakshman,
When asking for help on a forum, please always provide your full code. For example, it would greatly help to see how your page variable was created.
The solution to your issue is the possibly() function from the purrr package. It's a great function that lets your code keep running even when an error is encountered along the way. The following code scrapes all 15 tables on the link you provided.
library(dplyr)
library(rvest)
library(purrr)
link <- "https://www.thegreyhoundrecorder.com.au/results/forbury-park/63442"
xpaths <- paste0('//*[@id="race-', 1:15, '"]/table[2]')
scrape_table <- function(link, xpath){
  link %>%
    read_html() %>%
    html_nodes(xpath = xpath) %>%
    html_table() %>%
    flatten_df() %>%
    setNames(c("plc", "name_box", "trainer", "time", "mgn", "split", "in_run", "wgt", "sire", "dam", "sp"))
}
scrape_table_possibly <- possibly(scrape_table, otherwise = NULL)
scraped_tables <- map(xpaths, ~ scrape_table_possibly(link = link, xpath = .x))
The scraped_tables variable is a list of 15 elements (one for each table). The second element of the list is NULL, as specified by the otherwise argument of possibly(), because an error occurred while scraping that table.
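To see how possibly() behaves without any web scraping involved, here is a minimal, self-contained sketch using a toy function (safe_log is a made-up name for illustration, not part of purrr):

```r
library(purrr)

# Wrap a function that errors on negative input;
# possibly() returns `otherwise` instead of raising the error
safe_log <- possibly(function(x) {
  if (x < 0) stop("negative input")
  log(x)
}, otherwise = NULL)

results <- map(c(1, -1, exp(1)), safe_log)
# results[[2]] is NULL because the call errored;
# the other elements hold the successful return values
```

This is the same pattern as scrape_table_possibly above: any xpath that fails to match simply yields NULL in the output list instead of aborting the whole map() loop.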
Hope this helps.