Reputation: 15
Exception in thread "main" java.lang.Error: Unresolved compilation problem: Type mismatch: cannot convert from Iterator to Iterable
at com.spark.wordcount.lession1.WordCount2.main(WordCount2.java:26)
// SPACE is assumed to be a precompiled pattern, e.g. Pattern.compile(" ")
SparkConf conf = new SparkConf().setAppName("cust data").setMaster("local[*]");
JavaSparkContext sc = new JavaSparkContext(conf);
// Read the input file as an RDD of lines
JavaRDD<String> lines = sc.textFile("C:\\Users\\dell\\Desktop\\simple_text_file.txt");
// Split each line into words (this lambda returns an Iterator, as Spark 2.x expects)
JavaRDD<String> words = lines.flatMap(s -> Arrays.asList(SPACE.split(s)).iterator());
// Pair each word with a count of 1, then sum the counts per word
JavaPairRDD<String, Integer> ones = words.mapToPair(s -> new Tuple2<>(s, 1));
JavaPairRDD<String, Integer> counts = ones.reduceByKey((i1, i2) -> i1 + i2);
// Collect the results to the driver and print them
List<Tuple2<String, Integer>> output = counts.collect();
for (Tuple2<?, ?> tuple : output) {
    System.out.println(tuple._1() + ": " + tuple._2());
}
Upvotes: 1
Views: 3840
Reputation: 35229
You are mixing incompatible versions of Spark and code:
In Spark 2.x, FlatMapFunction.call is java.util.Iterator<R> call(T t).
In Spark 1.x, FlatMapFunction.call is Iterable<R> call(T t).
You should either upgrade your Spark dependency to 2.x and keep your current code, or use a FlatMapFunction compatible with the 1.x branch:
JavaRDD<String> words = lines.flatMap(new FlatMapFunction<String, String>() {
    @Override
    public Iterable<String> call(String s) {
        // Spark 1.x: call() returns an Iterable rather than an Iterator
        return Arrays.asList(SPACE.split(s));
    }
});
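For comparison, here is a minimal sketch of the same function written against the Spark 2.x signature; it is essentially what the lambda in the question resolves to once the project actually compiles against a 2.x dependency:

JavaRDD<String> words = lines.flatMap(new FlatMapFunction<String, String>() {
    @Override
    public Iterator<String> call(String s) {
        // Spark 2.x expects a java.util.Iterator, not an Iterable
        return Arrays.asList(SPACE.split(s)).iterator();
    }
});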
Upvotes: 3
Reputation: 3563
The problem should already be visible in your (Eclipse?) IDE through the red squiggly lines and the warning that your source contains errors before you actually execute the program. Eclipse nicely lets you run anyway and inserts code that throws the 'Unresolved compilation problem' error when you reach the code that has a problem.
The exception indicates that you are passing an Iterator<> while the calling method expects an Iterable<>, that is, an object that implements that interface and has a method iterator() which returns an Iterator.
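To see the difference outside of Spark, here is a small plain-Java sketch (the class name is just for illustration): an Iterable hands out an Iterator via its iterator() method, and the two types cannot be substituted for each other.

import java.util.Arrays;
import java.util.Iterator;
import java.util.List;

public class IterableVsIterator {
    public static void main(String[] args) {
        List<String> letters = Arrays.asList("a", "b", "c"); // List implements Iterable<String>
        Iterator<String> it = letters.iterator();            // the Iterable produces an Iterator
        while (it.hasNext()) {
            System.out.println(it.next());
        }
        // A for-each loop also requires an Iterable (it calls iterator() under the hood):
        for (String letter : letters) {
            System.out.println(letter);
        }
    }
}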
Upvotes: 0