Reputation: 3099
I am trying to run a simple example that writes data to Elasticsearch. However, I keep getting this error:
EsHadoopIllegalArgumentException: Cannot detect ES version - typically this happens if the network/Elasticsearch cluster is not accessible or when targeting a WAN/Cloud instance without the proper setting 'es.nodes.wan.only'
My dependencies for Spark and Elasticsearch:
scalaVersion := "2.11.5"
val sparkVersion = "2.3.0"
resolvers += "Spark Packages Repo" at "http://dl.bintray.com/spark-packages/maven"
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % sparkVersion,
  "org.apache.spark" %% "spark-sql" % sparkVersion,
  "org.apache.spark" %% "spark-streaming" % sparkVersion,
  "com.typesafe" % "config" % "1.3.0",
  "org.elasticsearch" %% "elasticsearch-spark-20" % "6.2.4"
)
Here is my example code:
import org.apache.spark.SparkConf
import org.apache.spark.sql.{SQLContext, SparkSession}
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.elasticsearch.spark._

object App {
  def main(args: Array[String]) {
    val sparkConf = new SparkConf()
      .setMaster(args(0))
      .setAppName("KafkaSparkStreaming")
    sparkConf.set("es.index.auto.create", "true")

    val sparkSession = SparkSession
      .builder()
      .config(sparkConf)
      .getOrCreate()

    val streamingContext = new StreamingContext(sparkSession.sparkContext, Seconds(3))
    val sparkContext = streamingContext.sparkContext
    sparkContext.setLogLevel("ERROR")
    val sqlContext = new SQLContext(sparkContext)

    val numbers = Map("one" -> 1, "two" -> 2, "three" -> 3)
    val airports = Map("arrival" -> "Otopeni", "SFO" -> "San Fran")
    // Write the two documents to the "spark/docs" index/type
    sparkContext.makeRDD(Seq(numbers, airports)).saveToEs("spark/docs")

    streamingContext.start()
    streamingContext.awaitTermination()
  }
}
I run Elasticsearch from a Docker image. Here is my docker-compose.yml file:
version: '3.3'
services:
  kafka:
    image: spotify/kafka
    ports:
      - "9092:9092"
    environment:
      - ADVERTISED_HOST=localhost
  elasticsearch:
    image: elasticsearch
  kibana:
    image: kibana
    ports:
      - "5601:5601"
What might cause this exception? I would really appreciate some help.
Upvotes: 1
Views: 3754
Reputation: 36
You can edit your Spark config by adding your ES hostname:
sparkConf.set("es.index.auto.create", "true")
sparkConf.set("es.nodes", "your_elasticsearch_ip")
sparkConf.set("es.port", "9200")
sparkConf.set("es.nodes.wan.only", "true")
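For example, applied to the code from the question, a minimal sketch of a working write might look like this (the es.nodes value "localhost" is an assumption; it presumes Elasticsearch is reachable from the host on port 9200, e.g. via the port mapping shown below):

import org.apache.spark.{SparkConf, SparkContext}
import org.elasticsearch.spark._

object EsWriteExample {
  def main(args: Array[String]): Unit = {
    val sparkConf = new SparkConf()
      .setMaster("local[*]")
      .setAppName("EsWriteExample")
    sparkConf.set("es.index.auto.create", "true")
    // "localhost" is an assumption; replace it with the address of your Elasticsearch node
    sparkConf.set("es.nodes", "localhost")
    sparkConf.set("es.port", "9200")
    sparkConf.set("es.nodes.wan.only", "true")

    val sc = new SparkContext(sparkConf)
    val numbers = Map("one" -> 1, "two" -> 2, "three" -> 3)
    // Fails with the same "Cannot detect ES version" error if the node is unreachable
    sc.makeRDD(Seq(numbers)).saveToEs("spark/docs")
    sc.stop()
  }
}

Note that es.nodes must point to an address that is reachable from where the Spark driver and executors actually run.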
You can also try to forward your ES port in your docker-compose file:
elasticsearch:
  image: elasticsearch
  ports:
    - "9200:9200"
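Once the port is published, you can check that the node is actually reachable from the host before re-running the Spark job. A minimal sketch of such a check (it assumes the default HTTP port 9200 on localhost):

// Prints the cluster info JSON if the node is up; throws an exception if it is unreachable.
object EsPing {
  def main(args: Array[String]): Unit = {
    val info = scala.io.Source.fromURL("http://localhost:9200").mkString
    println(info)
  }
}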
If that does not work, it may be a problem with the Spark connector, so you can redirect your ES calls to your local machine instead.
In your docker-compose file, add this command:
elasticsearch:
  image: elasticsearch
  command: "apt install -y socat && socat tcp-listen:9200,fork tcp:your_elasticsearch_ip:9200 &"
Or
command: "apt install -y socat && socat tcp-listen:9200,fork tcp:localhost:9200 &"
socat will forward your local port 9200 to your remote Elasticsearch port 9200.
Upvotes: 0
Reputation: 88
I faced a similar situation while experimenting with Spark and Elasticsearch. Replacing the "elasticsearch-spark" dependency with "elasticsearch-hadoop" so that it matched my Elasticsearch version solved the problem:
import scala.collection.mutable
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.elasticsearch.spark.streaming._

val conf = new SparkConf().setAppName("Sample").setMaster("local[*]")
conf.set("es.index.auto.create", "true")

val sc = new SparkContext(conf)
val ssc = new StreamingContext(sc, Seconds(10))

val numbers = Map("one" -> 1, "two" -> 2, "three" -> 3)
val airports = Map("arrival" -> "Otopeni", "SFO" -> "San Fran")
val rdd = sc.makeRDD(Seq(numbers, airports))
val microbatches = mutable.Queue(rdd)

// Write each micro-batch of the queue stream to the "spark/docs" index
ssc.queueStream(microbatches).saveToEs("spark/docs")

ssc.start()
ssc.awaitTermination()
Dependency list:
"org.apache.spark" %% "spark-core" % "2.2.0",
"org.apache.spark" %% "spark-sql" % "2.2.0",
"org.apache.spark" %% "spark-streaming" % "2.2.0",
"org.apache.spark" %% "spark-streaming-kafka-0-10" % "2.3.1",
"org.elasticsearch" %% "elasticsearch-hadoop" % "6.3.0",
Upvotes: 1