Using RStudio on Databricks

I need help figuring out how to use packages in RStudio on my laptop. I must use my government-issued computer, and I was not allowed to install Rtools alongside RStudio. All installations are done by admins, and we are not allowed to download anything on our own.

I am now trying to use RStudio on Databricks. In order to run analyses in the hosted RStudio through Databricks, you must connect to SparkR. I keep getting this error:

SparkR::sparkR.session()
Spark package found in SPARK_HOME: /databricks/spark
Launching java with spark-submit command /databricks/spark/bin/spark-submit sparkr-shell /tmp/RtmpU8QHTS/backend_port131e5e52a27d
Error: Could not find or load main class org.apache.spark.launcher.Main
/databricks/spark/bin/spark-class: line 101: CMD: bad array subscript
Error in sparkR.sparkContext(master, appName, sparkHome, sparkConfigMap, :
JVM is not ready after 10 seconds

I am thinking I need to find a way to connect Databricks to my desktop RStudio, which will hopefully solve the problem, but I am not sure whether it will just end up blocked again.
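For the desktop route, one possible sketch uses sparklyr with Databricks Connect. This assumes your admins can install the sparklyr package (and, for newer Databricks runtimes, its pysparklyr/Databricks Connect companion), and that your workspace permits personal access tokens; the cluster ID below is a placeholder, not a real value:

```r
# Hedged sketch: connect desktop RStudio to a Databricks cluster
# via sparklyr + Databricks Connect. Assumes sparklyr and its
# Databricks Connect dependencies are pre-installed by admins.
# The cluster ID is a placeholder; host and token are expected
# in the DATABRICKS_HOST and DATABRICKS_TOKEN environment variables.
library(sparklyr)

sc <- spark_connect(
  method     = "databricks_connect",   # Databricks Connect route
  cluster_id = "0123-456789-abcdefgh"  # placeholder cluster ID
)

# Once connected, dplyr verbs run against Spark, e.g.:
# library(dplyr)
# tbl(sc, "some_table") %>% count()
```

Whether this works will depend on what your admins allow to be installed and on your workspace's network policy, so treat it as a direction to explore rather than a guaranteed fix.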

I would love any advice on how to install packages in RStudio Desktop WITHOUT Rtools. Thank you!
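For what it's worth, Rtools is only required when a package has to be compiled from source; CRAN publishes pre-built Windows binaries for most packages, so something like the following may work without Rtools (the package names here are just examples):

```r
# Rtools is only needed to compile packages from source.
# CRAN ships pre-built Windows binaries for most packages,
# which install without any compiler toolchain.
# "dplyr" is just an example package name.
install.packages("dplyr", type = "binary")

# Or tell R to never fall back to source builds this session:
options(install.packages.compile.from.source = "never")
install.packages("data.table")
```

This only helps for packages that have a binary on CRAN for your R version; anything that must be built from source will still need Rtools (or an admin to install it for you).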

Hi,

Please have a look at RStudio Server / RStudio Cloud:

I hope this is helpful for you.

Kind regards

This topic was automatically closed 21 days after the last reply. New replies are no longer allowed.
