If you add this, it will do 50 records per page:
# Navigate to the page
consult$Page$navigate("http://www.css.gob.pa/p/grid_defensoria/")
################################
# New section
# Set the number of records to 50
consult$Runtime$evaluate('document.querySelector("#quant_linhas_f0_bot").value = 50')
consult$Runtime$evaluate('document.querySelector("#quant_linhas_f0_bot").dispatchEvent(new Event("change"))')
################################
# Initialize a tibble to store results
t <- tibble()
That will cut down the time quite a bit, so hopefully it helps. I would have expected the prior version to finish after running all night, but apparently it didn't.
You could probably cut the wait time down to 3 seconds. Maaaaaaaybe 2, but at that point I think you really risk duplicating data you've already pulled, and you risk putting too much traffic on their server, which is rude and could get you blocked. If I knew more JavaScript I could write something that waits until the site actually responds, which would be the safest approach, but alas, I don't.
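If you ever want to try the wait-until-it-responds approach, here's an untested sketch of how it could look in R with the same session object: poll the pager text (the same selector the logging line uses) until it changes, instead of sleeping a fixed time. The helper name and the timeout value are made up, and I haven't run this against the site:

```r
# Hypothetical helper: block until the grid's pager text changes,
# instead of sleeping a fixed number of seconds.
wait_for_update <- function(session, old_text, timeout = 10) {
  js <- '$("#sc_grid_toobar_bot > table > tbody > tr > td:nth-child(2)")[0].innerText'
  start <- Sys.time()
  repeat {
    new_text <- session$Runtime$evaluate(js)$result$value
    if (!identical(new_text, old_text)) return(invisible(new_text))
    if (difftime(Sys.time(), start, units = "secs") > timeout) {
      stop("Timed out waiting for the grid to update")
    }
    Sys.sleep(0.25)  # short poll; still easier on the server than re-clicking
  }
}
```

You'd grab the pager text before `click_next(consult)`, then call `wait_for_update(consult, old_text)` afterward in place of the fixed `Sys.sleep()`.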
Put this in for the loop section, and it will write out a log as it goes. It also adjusts the wait time to 3 seconds:
# Initialize a tibble to store results
t <- tibble()
i <- 0
# Start a fresh, empty log file
cat("", file = "log.csv", append = FALSE)
# While the next button is clickable, scrape the table and click the "next"
# button. Wait 3 seconds between requests to be polite
while (next_enabled(consult)) {
  t <- bind_rows(t, scrape_table(consult))
  click_next(consult)
  Sys.sleep(3)
  i <- i + 1
  # Log the iteration number and the grid's pager text ("records X to Y of Z")
  cat(i, ",", consult$Runtime$evaluate('$("#sc_grid_toobar_bot > table > tbody > tr > td:nth-child(2)")[0].innerText')$result$value, "\n",
      sep = "", file = "log.csv", append = TRUE)
}
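While that runs, you can check on it from a second R session by reading the log back in. A quick sketch (column names are my guesses at what the two logged fields mean):

```r
# Peek at scraping progress from another R session.
# Column 1 is the loop counter; column 2 is the pager text the site reports.
log <- read.csv("log.csv", header = FALSE,
                col.names = c("iteration", "pager_text"))
tail(log, 1)  # most recent page scraped
```

That also gives you a rough ETA: at 50 records per page and ~3 seconds per loop, the iteration count tells you how far along you are.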