Vijay_Shinde

Reputation: 1352

Getting error while initializing SparkR: JVM is not ready after 10 seconds

I'm currently testing an application using SparkR. These are my platform and application details:

Platform: Windows Server 2008
SparkR version: R version 3.1.2 (2014-10-31)
Spark version: 1.4.1

What I did:

Step I: Load package into R environment

library(SparkR)  # Working

Step II: Set the system environment variables

Sys.setenv(SPARK_HOME = "C:\\hdp\\spark-1.4.1-bin-hadoop2.6")  # Working (backslashes must be doubled in R strings)
.libPaths(c(file.path(Sys.getenv("SPARK_HOME"), "R", "lib"), .libPaths()))
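To confirm the variable was actually picked up before going further, a minimal check (assuming the standard Spark directory layout on Windows):

Sys.getenv("SPARK_HOME")  # should print the Spark directory
file.exists(file.path(Sys.getenv("SPARK_HOME"), "bin", "spark-submit.cmd"))  # should be TRUE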

Step III: Create a spark context and a SQL context

sc <- sparkR.init(master = "local", sparkHome = "C:\\hdp\\spark-1.4.1-bin-hadoop2.6", appName = "TestSparR")

I get an error at this line: JVM is not ready after 10 seconds.

Please help me resolve this issue. Thanks.
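Since the message suggests the JVM never started, a quick sanity check is whether R can see a Java runtime at all (a minimal diagnostic; output is machine-specific):

Sys.getenv("JAVA_HOME")  # an empty string means JAVA_HOME is not set for this R session
system("java -version")  # an error here means java is not on PATH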

Upvotes: 1

Views: 5759

Answers (2)

rad15f

Reputation: 29

This worked for me:

sparkPath <- 'C:/Users/YOUR PATH'  # path to your Spark installation
Sys.setenv(SPARK_HOME = sparkPath)
.libPaths(c(file.path(Sys.getenv('SPARK_HOME'), 'R', 'lib'), .libPaths()))
library(SparkR)
library(sparklyr)
sc <- spark_connect(master = 'local')  # connect via sparklyr rather than sparkR.init()
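Note that spark_connect() comes from sparklyr, not SparkR, so this launches Spark through sparklyr's own gateway rather than the SparkR backend that was timing out; that difference is likely why it works here.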

Upvotes: 0

quim

Reputation: 61

I had the same problem, and I tried many different things.

What finally worked for me, after restarting my computer (and R and RStudio along the way), was the following:

SPARK_HOME <- "C:\\Apache\\spark-1.5.2-bin-hadoop2.6\\"
# Extra arguments handed to spark-submit when the SparkR backend launches
Sys.setenv('SPARKR_SUBMIT_ARGS' = '"--packages" "com.databricks:spark-csv_2.10:1.2.0" "sparkr-shell"')
# Load the SparkR package bundled with the Spark distribution
library(SparkR, lib.loc = "C:\\Apache\\spark-1.5.2-bin-hadoop2.6\\R\\lib")
library(rJava)

sc <- sparkR.init(master = "local", sparkHome = SPARK_HOME)

Maybe this helps too: after restarting the system, this entry had appeared in my PATH environment variable:

C:\ProgramData\Oracle\Java\javapath
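If a restart is not convenient, the same effect can be approximated from within R before calling sparkR.init (a sketch; the Java path below is an example and will differ per machine):

java_path <- "C:\\ProgramData\\Oracle\\Java\\javapath"  # example path; adjust to your Java installation
Sys.setenv(PATH = paste(java_path, Sys.getenv("PATH"), sep = ";"))  # prepend so this java is found first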

Upvotes: 2
