AM01

Reputation: 314

Spark Custom AccumulatorV2

I have defined a Custom Accumulator as:

import org.apache.spark.util.LongAccumulator

class CustomAccumulator extends LongAccumulator with java.io.Serializable  {
  override def add(v: Long): Unit = {
    super.add(v)
    if (v % 100 == 0) println(v)
  }
}

And registered it as:

val cusAcc = new CustomAccumulator
sc.register(cusAcc, "customAccumulator")

My issue is that when I try to use it as:

val count = sc.customAccumulator  

I get the following error:

<console>:51: error: value customAccumulator is not a member of org.apache.spark.SparkContext
   val count = sc.customAccumulator

I am new to Spark and Scala, and may be missing something very trivial. Any guidance will be greatly appreciated.

Upvotes: 1

Views: 3968

Answers (2)

Dedkov Vadim

Reputation: 436

Since Spark 2.0.0 you should use the `register` method of the abstract class `AccumulatorV2` (`org.apache.spark.util.AccumulatorV2#register`).

Something like this:

cusAcc.register(sc, Option("customAccumulator"), false)

Upvotes: 0

user8375305

Reputation: 11

According to the Spark API, accumulators are no longer part of org.apache.spark.SparkContext; AccumulatorV2 lives in the org.apache.spark.util package instead.
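To make that concrete, here is a minimal sketch (assuming Spark 2.x) of the intended usage: you register the accumulator on the `SparkContext`, but you read its value from the accumulator object itself via `.value`, not as a member of `sc` — which is why `sc.customAccumulator` fails to compile.

```scala
import org.apache.spark.util.LongAccumulator

// LongAccumulator (an AccumulatorV2 subclass) is already Serializable,
// so no extra mixin is needed
class CustomAccumulator extends LongAccumulator {
  override def add(v: Long): Unit = {
    super.add(v)
    if (v % 100 == 0) println(v)
  }
}

val cusAcc = new CustomAccumulator
sc.register(cusAcc, "customAccumulator")

// accumulate on the executors
sc.parallelize(1L to 1000L).foreach(cusAcc.add)

// read the result on the driver from the accumulator itself,
// not from the SparkContext
val count = cusAcc.value
```

The name passed to `sc.register` is only a label shown in the Spark web UI; it does not create a field on `SparkContext`.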

Upvotes: 1
