Can I run a BigQuery SQL query and then continue wrangling the data using dbplyr?

In another project working with Amazon Athena I could do this:

con <- DBI::dbConnect(odbc::odbc(),
                      Driver = "path-to-driver",
                      S3OutputLocation = "location",
                      AwsRegion = "eu-west-1",
                      AuthenticationType = "IAM Profile",
                      AWSProfile = "profile",
                      Schema = "prod")


tbl(con,
    # Run SQL query
    sql("SELECT * FROM TABLE")) %>%
  # Without having collected the data, I could keep wrangling it
  # inside the database using dplyr verbs
  select(var1, var2) %>%
  mutate(var3 = var1 + var2)

However, when I try the same approach with BigQuery, I get the following error:

con <- DBI::dbConnect(bigrquery::bigquery(),
                      project = "project")

tbl(con,
    sql("SELECT * FROM TABLE"))

> Error: dataset is not a string (a length one character vector).

Is what I'm trying to do simply not possible with BigQuery?

Hi, I believe that's because tbl() is expecting a regular string (a table name) rather than a sql() object. Can you try passing just the name of the table you want to query, such as tbl(con, "my_table")? For more info, see the dplyr section of the bigrquery documentation: bigrquery: An Interface to Google's BigQuery API.
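
A minimal sketch of what that could look like (project, dataset, and table names below are placeholders). Note that bigrquery's DBI connection accepts a `dataset` argument; supplying it gives the connection a default dataset to resolve table names against, which is what the "dataset is not a string" error is complaining about:

```r
library(DBI)
library(dplyr)

# Connect with an explicit default dataset (placeholder names)
con <- dbConnect(bigrquery::bigquery(),
                 project = "my-project",
                 dataset = "my_dataset",
                 billing = "my-project")

# Lazy table reference; nothing is downloaded yet
tbl(con, "my_table") %>%
  select(var1, var2) %>%            # translated to SQL by dbplyr
  mutate(var3 = var1 + var2) %>%    # still runs inside BigQuery
  collect()                         # only now is the result pulled into R
```

You can also refer to a table in another dataset with a qualified name like `tbl(con, "my_dataset.my_table")`.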