Unable to Query Hive Tables (on S3) from RStudio

Below are the commands I am running and the error I see:

library(sparklyr)
library(DBI)

# spark <- spark_connect()
spark <- spark_connect(master = "yarn")
tbl_change_db(spark, "prod_pps_nih_batch")
rxt <- dbGetQuery(spark, "select * from dx_p360data_2020121011565282992 limit 10")
Error: java.io.IOException: Invalid region specified: US; Region can be configured with fs.s3a.s3guard.ddb.region: us-gov-west-1, us-gov-east-1, us-east-1, us-east-2, us-west-1, us-west-2, eu-west-1, eu-west-2, eu-west-3, eu-central-1, eu-north-1, ap-east-1, ap-south-1, ap-southeast-1, ap-southeast-2, ap-northeast-1, ap-northeast-2, sa-east-1, cn-north-1, cn-northwest-1, ca-central-1
    at org.apache.hadoop.fs.s3a.s3guard.DynamoDBClientFactory$DefaultDynamoDBClientFactory.getRegion(DynamoDBClientFactory.java:114)
    at org.apache.hadoop.fs.s3a.s3guard.DynamoDBClientFactory$DefaultDynamoDBClientFactory.createDynamoDBClient(DynamoDBClientFactory.java:85)
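The error comes from S3Guard: its DynamoDB client is being handed the region string "US", which is not a valid AWS region, so it refuses to start. One possible workaround is to set a valid region explicitly via the Hadoop property `fs.s3a.s3guard.ddb.region`, or to disable S3Guard altogether, in the sparklyr connection config. A minimal sketch, assuming the bucket lives in `us-east-1` (substitute your bucket's actual region) and that your cluster does not otherwise require S3Guard:

```r
library(sparklyr)
library(DBI)

conf <- spark_config()

# Option 1: give S3Guard a valid region instead of "US".
conf$spark.hadoop.fs.s3a.s3guard.ddb.region <- "us-east-1"  # assumption: adjust to your bucket's region

# Option 2 (alternative): disable S3Guard's metadata store entirely.
# conf$spark.hadoop.fs.s3a.metadatastore.impl <-
#   "org.apache.hadoop.fs.s3a.s3guard.NullMetadataStore"

spark <- spark_connect(master = "yarn", config = conf)
tbl_change_db(spark, "prod_pps_nih_batch")
rxt <- dbGetQuery(spark, "select * from dx_p360data_2020121011565282992 limit 10")
```

Whether option 1 or 2 is appropriate depends on how the cluster was set up; if S3Guard was enabled deliberately by your administrators, prefer fixing the region over disabling it.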
