Reputation: 53
New to Spark. I'm just trying to write a simple word count program, but at the last step, counts.take(10), I get the error below.
input_file=sc.textFile("C:/Users/abhishek.vij/Desktop/abhi.txt")
map = input_file.flatMap(lambda line: line.split(" ")).map(lambda word: (word, 1))
counts = map.reduceByKey(lambda a, b: a + b)
counts.take(10)
19/10/26 21:14:42 ERROR Executor: Exception in task 1.0 in stage 0.0 (TID 1) org.apache.spark.SparkException: Python worker exited unexpectedly (crashed)
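The code itself is a standard word count; "Python worker exited unexpectedly (crashed)" usually points at the environment rather than the logic, most often a mismatch between the Python interpreter the driver runs and the one the workers launch (common on Windows). Below is a minimal self-contained sketch of the same job under that assumption: it pins the worker Python to the driver's interpreter via Spark's standard PYSPARK_PYTHON settings, and renames the map variable, which otherwise shadows Python's built-in map (harmless here, but worth avoiding).

import os
import sys

# Assumption: the crash comes from a driver/worker interpreter
# mismatch, so point the workers at the Python running the driver.
os.environ["PYSPARK_PYTHON"] = sys.executable
os.environ["PYSPARK_DRIVER_PYTHON"] = sys.executable

from pyspark import SparkContext

sc = SparkContext("local[*]", "WordCount")

input_file = sc.textFile("C:/Users/abhishek.vij/Desktop/abhi.txt")
# "pairs" rather than "map", to avoid shadowing the built-in.
pairs = input_file.flatMap(lambda line: line.split(" ")) \
                  .map(lambda word: (word, 1))
counts = pairs.reduceByKey(lambda a, b: a + b)
print(counts.take(10))
sc.stop()

If the interpreters already match, a Python version too new for the installed Spark (for example, Python 3.8 on Spark 2.4.x before 2.4.6) is another known cause of this same worker crash.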
Upvotes: 1
Views: 2020
Reputation: 235
val counts = input_file.flatMap(line => line.split(" "))
  .map(word => (word, 1))
  .reduceByKey(_ + _)
counts.collect

I think this will work for you. (Note this snippet is Scala; the question uses PySpark.)
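For completeness, here is the same chain in PySpark, as a sketch assuming sc and input_file are defined as in the question (in Python, collect() needs parentheses, and reduceByKey takes an explicit lambda in place of Scala's _ + _):

counts = input_file.flatMap(lambda line: line.split(" ")) \
                   .map(lambda word: (word, 1)) \
                   .reduceByKey(lambda a, b: a + b)
counts.collect()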
Upvotes: 1