pythonic

Reputation: 21625

How to read multiple line elements in Spark?

When you read a file in Spark using sc.textFile, you get an RDD in which each element is a single line. However, I want each element to consist of N lines. I can't use a delimiter either, because the file has none. So, how can I make Spark give me multi-line elements?

I'm specifically interested in doing this with the NLineInputFormat class. Is that possible in Spark? I can see examples of it for MapReduce, but I have no idea how that would translate to Spark.

Upvotes: 4

Views: 926

Answers (1)

Mateusz Dymczyk

Reputation: 15141

Yes, as long as you are reading the files from Hadoop. You should be able to do it like this:

val records = sc.newAPIHadoopRDD(hadoopConf, classOf[NLineInputFormat], classOf[LongWritable], classOf[Text])

Here's the API doc.
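To expand on this, here is a minimal, hedged sketch of the full setup. The input path and the value of N are placeholders. One caveat worth noting: NLineInputFormat controls how many lines go into each input split (i.e. each partition), not how many lines go into each RDD element — every element is still one line. If you want one element per N lines, you can combine it with glom(), which collapses each partition into a single array:

```scala
import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.io.{LongWritable, Text}
import org.apache.hadoop.mapreduce.lib.input.NLineInputFormat
import org.apache.spark.{SparkConf, SparkContext}

val sc = new SparkContext(new SparkConf().setAppName("nlines"))

// Copy the context's Hadoop configuration so we don't mutate it globally.
val hadoopConf = new Configuration(sc.hadoopConfiguration)
// N lines per split; 4 is an arbitrary example value.
hadoopConf.setInt("mapreduce.input.lineinputformat.linespermap", 4)
// "path/to/input.txt" is a placeholder for your actual input path.
hadoopConf.set("mapreduce.input.fileinputformat.inputdir", "path/to/input.txt")

val records = sc.newAPIHadoopRDD(hadoopConf,
  classOf[NLineInputFormat], classOf[LongWritable], classOf[Text])

// Convert the reused Text objects to Strings immediately, then collapse
// each N-line partition into one Array[String] element.
val groups = records.map(_._2.toString).glom()
```

The map-to-String step matters because Hadoop RecordReaders may reuse the same Writable object across records; after glom(), each element of `groups` is an array of (up to) N consecutive lines.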

Upvotes: 2
