Raj

Reputation: 2398

Eclipse Autocomplete not suggesting the method in Spark/Scala

I am a newbie in Scala and am writing a word count program to find the number of occurrences of each unique word in a file using the Spark API. Find the code below:

import org.apache.spark.{SparkConf, SparkContext}

val conf = new SparkConf() // application name/master settings omitted
val sc = new SparkContext(conf)

//Load data from the file
val input = sc.textFile(args(0))

//Split each line into words
val words = input.flatMap { line => line.split(" ") }

//Assign a count of 1 to each word
val units = words.map { word => (word, 1) }

//Reduce by key to count each word
val counts = units.reduceByKey { case (x, y) => x + y }
...

Although the application compiles successfully, when I type units. in Eclipse, autocomplete does not suggest the method reduceByKey. For other methods, autocomplete works perfectly. Is there a specific reason for this?

Upvotes: 0

Views: 371

Answers (1)

Justin Pihony

Reputation: 67075

This is probably due to reduceByKey only being available via implicits. That method is not defined on RDD, but on PairRDDFunctions. I had thought that implicit autocompletion was working in Eclipse, but I would guess this to be your issue. You can verify by explicitly wrapping units in a PairRDDFunctions.
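A minimal sketch of that check, assuming the units RDD from the question (the variable name pairUnits is illustrative):

import org.apache.spark.rdd.PairRDDFunctions

// Wrap the pair RDD explicitly so reduceByKey is resolved without the implicit conversion
val pairUnits = new PairRDDFunctions(units)
val counts = pairUnits.reduceByKey(_ + _)

If autocomplete lists reduceByKey on pairUnits but not on units, the IDE is simply not resolving the implicit conversion from RDD[(K, V)] to PairRDDFunctions.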

Upvotes: 2
