Failed to find 'spark-submit2.cmd'

> library('BBmisc')
> library('sparklyr')
> sc <- spark_connect(master = 'local')
Error in start_shell(master = master, spark_home = spark_home, spark_version = version,  :
  Failed to find 'spark-submit2.cmd' under 'C:\Users\Owner\AppData\Local\spark\spark-3.0.0-bin-hadoop2.7', please verify - SPARK_HOME.
> spark_home_dir()
[1] "C:\\Users\\Owner\\AppData\\Local/spark/spark-3.0.0-bin-hadoop2.7"
> spark_installed_versions()
  spark hadoop                                                              dir
1 3.0.0    2.7 C:\\Users\\Owner\\AppData\\Local/spark/spark-3.0.0-bin-hadoop2.7
> spark_home_set()
Setting SPARK_HOME environment variable to C:\Users\Owner\AppData\Local/spark/spark-3.0.0-bin-hadoop2.7
> sc <- spark_connect(master = 'local')
Error in start_shell(master = master, spark_home = spark_home, spark_version = version,  :
  Failed to find 'spark-submit2.cmd' under 'C:\Users\Owner\AppData\Local\spark\spark-3.0.0-bin-hadoop2.7', please verify - SPARK_HOME.

source : https://github.com/englianhu/binary.com-interview-question/issues/1#issue-733943885

How can I solve the error "Failed to find 'spark-submit2.cmd' under 'C:\Users\Owner\AppData\Local\spark\spark-3.0.0-bin-hadoop2.7', please verify - SPARK_HOME."?
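
One way to narrow this down is to check whether the launcher script actually exists under the directory sparklyr reports. A minimal sketch, using only base R and the sparklyr helpers already shown in the transcript above (the path is whatever spark_home_dir() returns on your machine):

library(sparklyr)

# Where sparklyr thinks Spark lives, and which versions it has registered.
home <- spark_home_dir()
spark_installed_versions()

# The connection fails because this file cannot be found; check it directly.
file.exists(file.path(home, 'bin', 'spark-submit2.cmd'))
# FALSE here suggests the extracted archive is incomplete or SPARK_HOME
# points at a stale or partially deleted directory.

In this case the existing 3.0.0 directory appears to be broken, and re-installing a fresh release (as in the accepted answer below) resolves it.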

Solved !!!
Steps:

  1. Download a fresh Spark release from https://spark.apache.org/downloads.html.
  2. Extract the archive to 'C:/Users/scibr/AppData/Local/spark/spark-3.0.1-bin-hadoop3.2'.
  3. Manually select the newly extracted version: spark_home_set('C:/Users/scibr/AppData/Local/spark/spark-3.0.1-bin-hadoop3.2') (see the sketch below).
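
Put together, the fix in a new R session might look like the following sketch; the path is taken from the steps above and should be adjusted to wherever you actually extracted the archive:

library(sparklyr)

# Assumed location of the freshly extracted Spark 3.0.1 archive (step 2).
new_home <- 'C:/Users/scibr/AppData/Local/spark/spark-3.0.1-bin-hadoop3.2'

# Point SPARK_HOME at the new installation before connecting.
spark_home_set(new_home)

# The connection should now find bin/spark-submit2.cmd under SPARK_HOME.
sc <- spark_connect(master = 'local')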
