Hi, we're in the process of switching our data warehouse to BigQuery and overall it's going great, and I'm enjoying using the bigrquery package.
The only issue I have is that when a query has a large result set (e.g. 40 million records), it takes about an hour to download the data, which is dramatically slower than saving the output as a CSV in Google Cloud Storage and fetching the file manually with curl.
I've tried with and without writing to an intermediate table, and also setting a much larger page size, but neither seems to make a difference. I'd be really interested to know if anybody else has experienced this or has worked out a way around it.
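For reference, the export-and-download route I'm comparing against looks roughly like this; the project, dataset, table, and bucket names are placeholders, and this is only a sketch of the approach rather than a polished script:

```r
library(bigrquery)
library(googleCloudStorageR)

# Export the (already materialised) result table to GCS as CSV.
# "my-project", "my_dataset", "query_results" and the bucket are placeholders.
bq_table_save(
  bq_table("my-project", "my_dataset", "query_results"),
  "gs://my-bucket/query_results-*.csv",
  destination_format = "CSV"
)

# Then pull the exported file(s) straight from GCS to disk,
# which is the step that runs so much faster than bq_table_download().
gcs_get_object(
  "query_results-000000000000.csv",
  bucket = "my-bucket",
  saveToDisk = "query_results.csv"
)
```

The same download step can of course be done with `gsutil cp` or a curl request against a signed URL; the point is that the GCS path avoids paging the rows through the BigQuery API.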
Thanks
Jacob