Richard Liu

Reputation: 203

Spark launch fails with: find: 'version': No such file or directory

My environment is Windows 7, with Scala 2.11.4 installed (it works well) and Java 1.8.

I have tried spark-1.2.0-bin-hadoop2.4 and spark-1.2.1-bin-hadoop2.4, and each time I run

bin\spark-shell.cmd

I just get this error from Windows:

find: 'version': No such file or directory
else was unexpected at this time.

Is there anything I overlooked here?

Thank you so much.

Update (from spark-class2.cmd):

C:\Users\spark-1.2.1-bin-hadoop2.4>for /F "tokens=3" %i in ('java -version 2>&1 | find "version"') do set jversion=%i
find: 'version': No such file or directory
else was unexpected at this time.

And if I run java -version, it seems to be working on the Java side:

C:\Users\spark-1.2.1-bin-hadoop2.4>java -version
java version "1.8.0_31"
Java(TM) SE Runtime Environment (build 1.8.0_31-b13)
Java HotSpot(TM) 64-Bit Server VM (build 25.31-b07, mixed mode)

Upvotes: 1

Views: 3029

Answers (3)

Ihor B.

Reputation: 1363

I've solved this problem. Here is my solution.

I have Cygwin installed, and my PATH system variable points to C:\cygwin64\bin\, which contains a find.exe.

Therefore, the line in spark-class2.cmd

for /f "tokens=3" %%i in ('java -version 2^>^&1 ^| find "version"') do set jversion=%%i

did not use the proper "find" executable: it picked up Cygwin's find (a file-search tool, which treats "version" as a path) instead of the Windows find.exe (a text filter).

Changing that line to

for /f "tokens=3" %%i in ('java -version 2^>^&1 ^| C:\Windows\System32\find.exe "version"') do set jversion=%%i

fixed my problem.
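The mechanism behind this is ordinary PATH resolution: the first directory on the search path that contains a matching executable wins. A minimal sketch of that lookup rule (hypothetical throwaway directories, POSIX-style Python purely for illustration):

```python
import os
import shutil
import stat
import tempfile

# Two throwaway directories standing in for C:\cygwin64\bin and
# C:\Windows\System32, each holding an executable named "find".
cygwin_like = tempfile.mkdtemp()
system_like = tempfile.mkdtemp()
for d in (cygwin_like, system_like):
    exe = os.path.join(d, "find")
    with open(exe, "w") as f:
        f.write("#!/bin/sh\n")
    os.chmod(exe, os.stat(exe).st_mode | stat.S_IEXEC)

# With the Cygwin-like directory first on PATH, its "find" shadows the other.
search_path = os.pathsep.join([cygwin_like, system_like])
resolved = shutil.which("find", path=search_path)
print(resolved == os.path.join(cygwin_like, "find"))  # prints: True
```

Spelling out the full path C:\Windows\System32\find.exe, as in the fixed line above, bypasses that search entirely.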

Upvotes: 3

busybug91

Reputation: 241

My friend had the same problem. We figured it had something to do with Java, so we reinstalled the Java SDK (1.7) and checked that PATH and JAVA_HOME were properly set. After that, spark-shell.cmd and spark-submit.cmd worked fine.

Platform: Windows 8.

Upvotes: 0

Richard Liu

Reputation: 203

I just realized the code there was trying to figure out the JVM version, and since I already know it is 1.8.0_31, here is my silly workaround (in spark-class2.cmd):

rem Set JAVA_OPTS to be able to load native libraries and to set heap size
rem for /f "tokens=3" %%i in ('java -version 2^>^&1 ^| %windir%\system32\FIND.exe "version"') do echo %%i
rem for /f "tokens=1 delims=_" %%i in ("%jversion:~1,-1%") do set jversion=%%i
rem if "%jversion%" geq "1.8.0" (
  set JAVA_OPTS=%OUR_JAVA_OPTS% -Xms%OUR_JAVA_MEM% -Xmx%OUR_JAVA_MEM%
rem ) else (
rem   set JAVA_OPTS=-XX:MaxPermSize=128m %OUR_JAVA_OPTS% -Xms%OUR_JAVA_MEM% -Xmx%OUR_JAVA_MEM%
rem )

And I think the Spark team needs to work on this version-detection code:

rem Set JAVA_OPTS to be able to load native libraries and to set heap size
for /f "tokens=3" %%i in ('java -version 2^>^&1 ^| %windir%\system32\FIND.exe "version"') do echo %%i
for /f "tokens=1 delims=_" %%i in ("%jversion:~1,-1%") do set jversion=%%i
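For reference, the detection those batch lines perform can be sketched in Python (assuming the java -version banner shown earlier in the question):

```python
# First line of `java -version 2>&1`, as shown in the question.
banner = 'java version "1.8.0_31"'

token3 = banner.split()[2]      # batch "tokens=3"        -> '"1.8.0_31"'
jversion = token3[1:-1]         # %jversion:~1,-1% strips the quotes
major = jversion.split("_")[0]  # "tokens=1 delims=_"     -> "1.8.0"

# The batch `geq "1.8.0"` string comparison decides whether the
# pre-Java-8 -XX:MaxPermSize flag should be added to JAVA_OPTS.
needs_permsize = major < "1.8.0"
print(major, needs_permsize)  # prints: 1.8.0 False
```

So hard-coding the Java-8 branch, as in the workaround above, gives the same result for this JVM; it just skips the broken find call.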

Upvotes: 0
