I don't use Google Cloud, so I can't give a definitive answer, but here are some possibilities. First, identify the actual method being called: it's collect.tbl_BigQueryConnection(), whose source code you can find here. Note its arguments:
collect.tbl_BigQueryConnection <- function(x, ...,
                                           page_size = NULL,
                                           max_connections = 6L,
                                           n = Inf,
                                           warn_incomplete = TRUE)
So you could try decreasing max_connections and increasing page_size (which is what the error message suggests).
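For example, a sketch of what that call could look like (tbl_con stands in for your dbplyr table object, and the specific values are guesses; the right ones depend on your row size and the error you're seeing):

```r
library(dplyr)
library(bigrquery)

# tbl_con is a hypothetical tbl() backed by a BigQuery connection.
# Fewer parallel connections and larger pages may avoid the error:
result <- tbl_con %>%
  collect(page_size = 20000, max_connections = 2L)
```

Since collect() just forwards these arguments, you can tune them without touching the rest of your dplyr pipeline.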
Then, if you look at the function source code, you see there is a bit of boilerplate, and at the end this call:
out <- bq_table_download(tb,
  n_max = n,
  page_size = page_size,
  quiet = quiet,
  max_connections = max_connections,
  bigint = bigint
)
So the actual work of downloading the data is done by bq_table_download(), with the arguments passed down to it (which explains why the error message mentions it). Let's look at that function's manual:
This retrieves rows in chunks of page_size. It is most suitable for results of smaller queries (<100 MB, say). For larger queries, it is better to export the results to a CSV file stored on google cloud and use the bq command line tool to download locally.
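That export route can also be driven from R via bigrquery itself, using bq_table_save() to write the table to Cloud Storage. A sketch, assuming a hypothetical project, dataset, and bucket (replace with your own, and note you need write access to the bucket):

```r
library(bigrquery)

# 1. Materialize the query into a BigQuery table, then export it to
#    Cloud Storage as (possibly sharded) CSV files.
tb <- bq_project_query("my-project", "SELECT * FROM my_dataset.my_table")
bq_table_save(tb, "gs://my-bucket/export-*.csv")

# 2. Download the shards locally, e.g. with the gsutil command line tool:
#    gsutil cp "gs://my-bucket/export-*.csv" ./data/
```

The wildcard in the destination URI lets BigQuery shard large exports across multiple files, which is required once the result exceeds 1 GB.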
So, I don't have the GC knowledge to tell you exactly how to do that, but consider the size: with 60M rows, if each has 5 numeric columns (8 bytes per value), that's about 6e7 * 5 * 8 bytes ≈ 2.4 GB, well past the ~100 MB threshold, so you're squarely in the second situation.
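As a quick back-of-the-envelope check in R (assuming 5 double-precision columns, 8 bytes each):

```r
rows <- 6e7   # 60M rows
cols <- 5     # numeric (double) columns, 8 bytes per value
rows * cols * 8 / 1e9
#> [1] 2.4    # roughly 2.4 GB of raw values, before any overhead
```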