masmithd

Reputation: 101

Create Spark DataFrame in Spark Streaming from JSON Message on Kafka

I'm working on a Spark Streaming implementation in Scala where I pull JSON strings from a Kafka topic and want to load them into a DataFrame. Is there a way to do this where Spark infers the schema on its own from an RDD[String]?

Upvotes: 9

Views: 7825

Answers (4)

Brian

Reputation: 976

There is no schema inference on streaming. You can always read a sample file once and pull the schema from it. You could also commit that schema file to version control or put it in an S3 bucket, and load it in your streaming job.
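That suggestion can be sketched as follows, as a minimal example assuming spark-sql is on the classpath. The field names here are made up for illustration; in practice you would infer the schema once from a sample file (e.g. `sqlContext.read.json("sample.json").schema`), persist its JSON form, and load it back in the streaming job:

```scala
import org.apache.spark.sql.types.{DataType, StringType, StructField, StructType}

// Hypothetical schema, standing in for one inferred from a sample file.
val schema = StructType(Seq(
  StructField("user", StringType),
  StructField("event", StringType)
))

// Serialize the schema to JSON so it can be committed to version control
// or dropped in an S3 bucket, as suggested above.
val schemaJson = schema.json

// In the streaming job, load it back and pass it to the reader
// instead of relying on per-batch inference:
val restored = DataType.fromJson(schemaJson).asInstanceOf[StructType]
// val df = sqlContext.read.schema(restored).json(jsonRDD)
```

`StructType.json` and `DataType.fromJson` round-trip the schema exactly, so the streaming job never has to re-infer it.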

Upvotes: 0

radek1st

Reputation: 1647

You can use the following code to read the stream of messages from Kafka, extract the JSON values, and convert them to a DataFrame:

import kafka.serializer.StringDecoder
import org.apache.spark.streaming.kafka.KafkaUtils

// Create a direct stream of (key, value) pairs from the Kafka topic(s)
val messages = KafkaUtils.createDirectStream[String, String, StringDecoder, StringDecoder](ssc, kafkaParams, topicsSet)

messages.foreachRDD { rdd =>
  // extract the message values only and let Spark infer the schema per batch
  val df = sqlContext.read.json(rdd.map(x => x._2))
  df.show()
}

Upvotes: 1

Kiara Grouwstra

Reputation: 5933

Yes, you can use the following:

sqlContext.read
  //.schema(schema) // optional; makes it a bit faster. If you've processed the data before, you can get the schema from df.schema
  .json(jsonRDD)    // jsonRDD: RDD[String]

I'm trying to do the same at the moment. I'm curious how you got the RDD[String] out of Kafka, though; I'm still under the impression that Spark+Kafka only does streaming rather than one-off "take what's in there right now" batches. :)

Upvotes: 3

sparklearner

Reputation: 403

In Spark 1.4, you could try the following method to generate a DataFrame from an RDD:

  val hiveContext = new org.apache.spark.sql.hive.HiveContext(sc)
  // yourRDD must be an RDD of case classes (or Rows paired with a schema),
  // not raw JSON strings
  val yourDataFrame = hiveContext.createDataFrame(yourRDD)

Upvotes: 2
