FatalError

Reputation: 964

Spark reduceByKey(arguments) in the type {type} is not applicable for the arguments {arguments}

I am getting the following error for the reduce, fold, and reduceByKey functions. I searched the forums but had no luck.

The method reduceByKey(Function2) in the type JavaPairRDD is not applicable for the arguments (new Function2(){})

termsRDD is of type JavaPairRDD<Integer, Integer>

    termsRDD.reduceByKey(new Function2<Integer,Integer,Integer>(){

        private static final long serialVersionUID = -376611514417758310L;

        @Override
        public Integer apply(Integer arg0, Integer arg1) throws Exception {
            // TODO Auto-generated method stub
            return arg0+arg1;
        }

    });

It's fairly simple code that sums up the values for each key.

Any help is much appreciated. Thanks

Upvotes: -1

Views: 224

Answers (1)

Alper t. Turker

Reputation: 35219

You implemented the wrong interface. Your anonymous class overrides apply, which belongs to scala.Function2, but JavaPairRDD.reduceByKey expects org.apache.spark.api.java.function.Function2, whose single abstract method is call.

It should be:

    import org.apache.spark.api.java.function.Function2;

    termsRDD.reduceByKey(new Function2<Integer, Integer, Integer>() {
        @Override
        public Integer call(Integer arg0, Integer arg1) throws Exception {
            return arg0 + arg1;
        }
    });
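
With Java 8 or later you can also pass a lambda instead of the anonymous class, since Spark's Function2 is a functional interface. A minimal sketch, assuming termsRDD is the JavaPairRDD<Integer, Integer> from the question (summed is just an illustrative variable name):

    import org.apache.spark.api.java.JavaPairRDD;

    // Same per-key sum expressed as a lambda; the compiler infers
    // org.apache.spark.api.java.function.Function2 as the target type.
    JavaPairRDD<Integer, Integer> summed = termsRDD.reduceByKey((a, b) -> a + b);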

Upvotes: 1
