RStudio to Hadoop cluster connection to read Hive database tables

We are facing issues while connecting to the cluster. Please check the image.

It's incredibly hard to read what's happening based on the screenshot.

Could you please turn this into a self-contained reprex (short for reproducible example)? It will help us help you if we can be sure we're all working with/looking at the same stuff.

Right now the best way to install reprex is:

```r
# install.packages("devtools")
devtools::install_github("tidyverse/reprex")
```

If you've never heard of a reprex before, you might want to start by reading the tidyverse.org help page. The reprex dos and don'ts are also useful.
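
Once it's installed, a minimal call looks something like this — wrap the code you want to share in braces (or copy it to your clipboard and call `reprex()` with no arguments), and reprex puts forum-ready markdown back on your clipboard:

```r
library(reprex)

# Render a small, self-contained snippet; the rendered markdown is
# copied to the clipboard, ready to paste into the forum.
reprex({
  y <- 1:4
  mean(y)
})
```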

What to do if you run into clipboard problems

If you run into problems with access to your clipboard, you can specify an outfile for the reprex, and then copy and paste the contents into the forum.

```r
reprex::reprex(input = "fruits_stringdist.R", outfile = "fruits_stringdist.md")
```

For pointers specific to the community site, check out the reprex FAQ, linked to below.

If for some reason you absolutely can't create a reprex, please add code formatting to your post in order to make it more legible. To do so, you just add triple backticks around the code chunk (and add an R next to the opening backticks for syntax highlighting):

```r
library(foo)
```

Hi,

Thank you so much for your response.

Here is our code, followed by the error:

```r
# Load cluster environment variables and add the relevant bin/ directories to PATH.
# Backslashes must be doubled in R strings ("C:\opt" is not a valid string literal).
dotenv::load_dot_env(file = "C:\\opt\\mapr\\spark.env")
JAVA_HOME_BIN    <- paste0(Sys.getenv("JAVA_HOME"), "\\bin")
SPARK_HOME_BIN   <- paste0(Sys.getenv("SPARK_HOME"), "\\bin")
HADOOP_HOME_BIN  <- paste0(Sys.getenv("HADOOP_HOME"), "\\bin")
MAPR_HOME_SERVER <- paste0(Sys.getenv("MAPR_HOME"), "\\server")
PATH_ORG <- Sys.getenv("PATH")
Sys.setenv("PATH" = paste(PATH_ORG, JAVA_HOME_BIN, SPARK_HOME_BIN, HADOOP_HOME_BIN, MAPR_HOME_SERVER, sep = ";"))
library(sparklyr)
# Connect to the cluster in yarn-client mode, pinning Spark to the host's IP
config <- list(
  "sparklyr.log.console"         = TRUE,
  spark.env.SPARK_LOCAL_IP       = "10.21.1.162",
  spark.env.SPARK_PUBLIC_DNS     = "10.21.1.162",
  spark.env.SPARK_LOCAL_HOSTNAME = "10.21.1.162"
)
sc <- spark_connect("yarn-client", config = config, version = "2.0.1")
```

Error:

```
Error in force(code) :
  Failed while connecting to sparklyr to port (8880) for sessionid (45174): Gateway in port (8880) did not respond.
  Path: C:\opt\mapr\spark\spark-2.0.1\bin\spark-submit2.cmd
  Parameters: --class, sparklyr.Shell, "C:\Users\bmotolentino\Documents\R\win-library\3.4\sparklyr\java\sparklyr-2.0-2.11.jar", 8880, 45174
  Log: C:\Users\BMOTOL~1\AppData\Local\Temp\Rtmp4ufsbA\file30645aae2cf2_spark.log

---- Output Log ----
```
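
For context, once `spark_connect()` succeeds, this is roughly how we plan to read the Hive tables (the table name below is a placeholder, not an actual table on our cluster):

```r
library(dplyr)
library(DBI)

# Option 1: lazily reference a Hive table through dplyr and pull a few rows
tbl(sc, "my_hive_table") %>%
  head(10) %>%
  collect()

# Option 2: run HiveQL directly through sparklyr's DBI interface
dbGetQuery(sc, "SHOW TABLES")
dbGetQuery(sc, "SELECT * FROM my_hive_table LIMIT 10")
```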

