Cannot connect RStudio Desktop to Spark through sparklyr

I am trying to connect to a local Spark master through sparklyr. My code is:

pacman::p_load(sparklyr, sparknlp, tidyverse)
options(sparklyr.log.console = TRUE)
sc <- spark_connect(master = "local", version = "3.0.0")
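
For reference, here is a quick way to confirm that Spark 3.0.0 is actually installed locally for sparklyr (a sketch, assuming the default sparklyr-managed install):

# List the Spark versions sparklyr has installed locally
spark_installed_versions()
# Install Spark 3.0.0 if it is not in that list (only needed once)
spark_install(version = "3.0.0")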

An error message is returned: Error in spark_connect_gateway(gatewayAddress, gatewayPort, sessionId, : Gateway in localhost:8880 did not respond. I then printed the log to the console, which looks like this:

Exception in thread "main" java.lang.ExceptionInInitializerError
	at org.apache.spark.unsafe.array.ByteArrayMethods.<clinit>(ByteArrayMethods.java:54)
	at org.apache.spark.internal.config.package$.<init>(package.scala:1006)
	at org.apache.spark.internal.config.package$.<clinit>(package.scala)
	at org.apache.spark.deploy.SparkSubmitArguments.$anonfun$loadEnvironmentArguments$3(SparkSubmitArguments.scala:157)
	at scala.Option.orElse(Option.scala:447)
	at org.apache.spark.deploy.SparkSubmitArguments.loadEnvironmentArguments(SparkSubmitArguments.scala:157)
	at org.apache.spark.deploy.SparkSubmitArguments.<init>(SparkSubmitArguments.scala:115)
	at org.apache.spark.deploy.SparkSubmit$$anon$2$$anon$3.<init>(SparkSubmit.scala:990)
	at org.apache.spark.deploy.SparkSubmit$$anon$2.parseArguments(SparkSubmit.scala:990)
	at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:85)
	at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1007)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1016)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.reflect.InaccessibleObjectException: Unable to make private java.nio.DirectByteBuffer(long,int) accessible: module java.base does not "opens java.nio" to unnamed module @13545af8
	at java.base/java.lang.reflect.AccessibleObject.checkCanSetAccessible(AccessibleObject.java:357)
	at java.base/java.lang.reflect.AccessibleObject.checkCanSetAccessible(AccessibleObject.java:297)
	at java.base/java.lang.reflect.Constructor.checkCanSetAccessible(Constructor.java:188)
	at java.base/java.lang.reflect.Constructor.setAccessible(Constructor.java:181)
	at org.apache.spark.unsafe.Platform.<clinit>(Platform.java:56)
	... 13 more

I have installed the latest Java updates, and I suspect the issue is related to my Spark or Java configuration, but I don't know what to try next.

I am on macOS 12.3 with RStudio 2021.09.1. Any help would be much appreciated.

Hi! Would you mind sharing the output of this command?

system2("java", "--version")

I think it may have to do with your installed Java version.

Hi! Thanks for your response. The output is:

java 16.0.2 2021-07-20
Java(TM) SE Runtime Environment (build 16.0.2+7-67)
Java HotSpot(TM) 64-Bit Server VM (build 16.0.2+7-67, mixed mode, sharing)

Does this version seem suitable for Spark?

Oh OK, can you try the Java 11 JDK instead? Spark 3.0 officially supports Java 8 and 11. On Java 16 the JDK's module system blocks the reflective access Spark relies on, which is exactly the InaccessibleObjectException in your log.
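
If you are not sure which JDKs you have installed, macOS ships a helper that lists them; you can call it from R (a sketch, assuming at least one JDK is present):

# List every JDK known to macOS, with versions and paths
system2("/usr/libexec/java_home", "-V")
# Print the home directory of the Java 11 JDK specifically
system2("/usr/libexec/java_home", c("-v", "11"))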

Hi! I installed the Java 11 JDK and set it as the Java home in RStudio, and now I can connect to Spark! Really amazing! Thank you very much for your advice!

In case someone else runs into a similar issue, I want to post how I solved it. I am on a Mac, so I opened Terminal and switched the Java version from 16 to 11:

# Check which Java versions are installed
ls /Library/Java/JavaVirtualMachines/
# Choose one version from the output above and set it as the Java home
export JAVA_HOME=/Library/Java/JavaVirtualMachines/jdk-11.jdk/Contents/Home
# Check the current Java home
echo $JAVA_HOME

Then, in RStudio, set the Java home for the current session:

# Set JAVA_HOME to the directory returned by `echo $JAVA_HOME` above
Sys.setenv(JAVA_HOME = "/Library/Java/JavaVirtualMachines/jdk-11.jdk/Contents/Home")
# Check the current Java version; it should now be 11
system2("java", "--version")
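
One caveat: Sys.setenv() only lasts for the current R session. To make the change permanent, you can add JAVA_HOME to your ~/.Renviron file, which R reads at startup (a sketch; adjust the path if your JDK 11 lives elsewhere):

# Append JAVA_HOME to ~/.Renviron so every new R session picks it up
write("JAVA_HOME=/Library/Java/JavaVirtualMachines/jdk-11.jdk/Contents/Home",
      file = "~/.Renviron", append = TRUE)

After restarting R, Sys.getenv("JAVA_HOME") should show the new path, and spark_connect() should work without any extra setup.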
