bq_auth() to authenticate bigrquery using service account on a Linux server

We are trying to authenticate to BigQuery in an R script scheduled to run on a Linux server. We have the following line of code pointing to a valid service account for our BigQuery database on GCP:

bigrquery::bq_auth(path = '~/path-to/credentials/our-creds.json')

and when I run this locally in R, I get this prompt

The bigrquery package is requesting access to your Google account.
Select a pre-authorised account or enter '0' to obtain a new token.
Press Esc/Ctrl + C to cancel.

1: myemail@gmail.com

Selection: 

When we run the whole script on our Linux server with Rscript our_script.R, we get the following error:

Error: Can't get Google credentials.
Are you running bigrquery in a non-interactive session? Consider:
  * Call `bq_auth()` directly with all necessary specifics.
Execution halted

When we instead open an interactive session on the Linux server (by running R) and manually run the bq_auth() line of code, we get:

> bigrquery::bq_auth(path = '~/path-to/credentials/our-creds.json')
Waiting for authentication in browser...
Press Esc/Ctrl + C to abort

However, because we don't have a browser to authenticate in (we are SSH'd into our Linux server, which is on GCP), the terminal just hangs here. I guess we could try to find some way to access the server's browser locally, but that doesn't seem right. We need to schedule this script to run daily, so we cannot do manual authentication by pulling up a browser on the server and clicking to authenticate.

How can we successfully authenticate on the server, so Rscript our_script.R can run successfully? Info on our server, if it helps:

[screenshot of server details]

I can share additional info if that would help. Note that we also tried passing myemail@gmail.com to bq_auth()'s email parameter, but this did not resolve the issue. We see there is a token parameter, and perhaps we can manually supply a token somewhere? This is all very frustrating: I thought using a service account key was supposed to avoid this issue, and yet we are pointing to a valid service account key with BigQuery permissions and still cannot authenticate on the server.

A few troubleshooting ideas:

... R script scheduled to run on a linux server....

Whenever I hear "scheduled", I think "cron", and when I hear "cron", I think "path problems". So in the server/schedule context, I recommend doing very pedantic checks that you truly understand what the working directory is, which user the job runs as, and whether each path resolves to what you intend.
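Those pedantic checks can be done at the top of the scheduled script itself, so the answers end up in the cron log. A minimal sketch (the credentials path is the one from the question):

```r
# Print the environment the cron job actually sees: working directory,
# user, and whether the credentials path resolves to a real file.
cat("Working directory:", getwd(), "\n")
cat("User:", Sys.info()[["user"]], "\n")

cred_path <- path.expand("~/path-to/credentials/our-creds.json")
cat("Resolved path:", cred_path, "\n")
cat("File exists:", file.exists(cred_path), "\n")
```

Note that `~` expands relative to the home directory of whichever user cron runs the job as, which is a common source of surprises.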

However, the fact that this service account token doesn't even seem to work for you in a local interactive session suggests something is wrong with the file. Historically, people have tried to use other JSON files, such as the JSON for an OAuth client, as if it were the JSON for a service account, which does not work, for obvious reasons.
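One quick way to check which kind of JSON file you have: a service account key has a top-level "type": "service_account" field, while an OAuth client JSON has a top-level "installed" or "web" field instead. A sketch using jsonlite (an assumption; any JSON parser works):

```r
# Inspect the key file to confirm it is actually a service account key.
creds <- jsonlite::fromJSON("~/path-to/credentials/our-creds.json")

if (identical(creds$type, "service_account")) {
  message("Looks like a service account key for: ", creds$client_email)
} else {
  warning("Not a service account key; top-level fields found: ",
          paste(names(creds), collapse = ", "))
}
```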

You can gain more insight into how this is failing and falling through to interactive auth by setting the gargle verbosity option.
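In practice that means turning up gargle's logging before the auth call, e.g.:

```r
# Make gargle narrate its token search so you can see why the service
# account key is rejected and why auth falls back to the browser flow.
options(gargle_verbosity = "debug")
bigrquery::bq_auth(path = "~/path-to/credentials/our-creds.json")
```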

And in case you have not already found it (and for the benefit of other readers), there is a general overview of how to set up non-interactive auth.
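The shape of the non-interactive pattern is: authenticate explicitly with the service account key before any BigQuery call, so the interactive flow is never triggered. A sketch (the path is the one from the question; the project id is hypothetical):

```r
# Non-interactive auth for a scheduled script: authenticate up front
# with the service account key, then query as usual.
library(bigrquery)

bq_auth(path = "~/path-to/credentials/our-creds.json")

# All subsequent bq_*() calls use the service account token, e.g.:
# tb <- bq_project_query("our-gcp-project", "SELECT 1 AS x")
```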

Yes, we are scheduling this script to run via cron, and we have double-checked the path, which is correct. (We are successfully loading a CSV file from the same directory as the JSON credentials file.)

I hadn't even processed that it wasn't working locally! I will troubleshoot the service account and update soon.

It was a faulty service account key after all. Thanks a ton!
