Maciej

Reputation: 1

Issue after Spark Installation on Windows 10

This is the cmd log I see after running the spark-shell command (C:\Spark>spark-shell). As I understand it, it's mainly a Hadoop issue. I'm on Windows 10. Can somebody please help with the issue below?

C:\Users\mac>cd c:\
c:\>winutils\bin\winutils.exe chmod 777 \tmp\hive
c:\>cd c:\spark
c:\Spark>spark-shell


Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
17/05/14 13:21:25 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
17/05/14 13:21:34 WARN General: Plugin (Bundle) "org.datanucleus.store.rdbms" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/c:/Spark/bin/../jars/datanucleus-rdbms-3.2.9.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/C:/Spark/jars/datanucleus-rdbms-3.2.9.jar."
17/05/14 13:21:34 WARN General: Plugin (Bundle) "org.datanucleus" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/c:/Spark/bin/../jars/datanucleus-core-3.2.10.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/C:/Spark/jars/datanucleus-core-3.2.10.jar."
17/05/14 13:21:34 WARN General: Plugin (Bundle) "org.datanucleus.api.jdo" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/c:/Spark/bin/../jars/datanucleus-api-jdo-3.2.6.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/C:/Spark/jars/datanucleus-api-jdo-3.2.6.jar."
17/05/14 13:21:48 WARN ObjectStore: Failed to get database global_temp, returning NoSuchObjectException
Spark context Web UI available at http://192.168.1.9:4040
Spark context available as 'sc' (master = local[*], app id = local-1494764489031).
Spark session available as 'spark'.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.1.1
      /_/

Using Scala version 2.11.8 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_131)
Type in expressions to have them evaluated.
Type :help for more information.

Upvotes: 0

Views: 504

Answers (1)

Jacek Laskowski

Reputation: 74669

There's no issue in your output. These WARN messages can simply be ignored.
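If the WARN noise bothers you, the startup banner in your log already points at the knob: you can raise the log threshold from inside the shell. A quick, optional tweak (not required for a healthy install):

```scala
// Inside spark-shell: only messages at ERROR level or above will be printed
sc.setLogLevel("ERROR")
```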

In other words, it looks like you've installed Spark 2.1.1 on Windows 10 properly.

To make sure you installed it properly (so I could drop the "looks like" from the sentence above), run the following:

spark.range(1).show

By default, that triggers loading the Hive classes, which may or may not end in exceptions on Windows due to Hadoop's requirements (hence the need for winutils.exe to work around them).
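For reference, on a working install that command prints a one-row table; the exact session transcript below is a sketch from a typical Spark 2.x spark-shell, but the table itself is what you should see:

```scala
scala> spark.range(1).show
+---+
| id|
+---+
|  0|
+---+
```

If instead you get a `HiveException` or a permissions error mentioning `\tmp\hive`, re-check the winutils chmod step from the question.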

Upvotes: 1

Related Questions