Reputation: 11627
I am new to Spark and I have an issue with serialization inside the map function. Here is part of the code:
private Function<Row, String> SparkMap() throws IOException {
    return new Function<Row, String>() {
        public String call(Row row) throws IOException {
            /* some code */
        }
    };
}

public static void main(String[] args) throws Exception {
    MyClass myClass = new MyClass();
    SQLContext sqlContext = new SQLContext(sc);
    DataFrame df = sqlContext.load(args[0], "com.databricks.spark.avro");
    JavaRDD<String> output = df.javaRDD().map(myClass.SparkMap());
}
Here's the error log:
Caused by: java.io.NotSerializableException: myPackage.MyClass
Serialization stack:
- object not serializable (class: myPackage.MyClass, value: myPackage.MyClass@281c8380)
- field (class: myPackage.MyClass$1, name: this$0, type: class myPackage.MyClass)
- object (class myPackage.MyClass$1, myPackage.MyClass$1@28ef1bc8)
- field (class: org.apache.spark.api.java.JavaPairRDD$$anonfun$toScalaFunction$1, name: fun$1, type: interface org.apache.spark.api.java.function.Function)
- object (class org.apache.spark.api.java.JavaPairRDD$$anonfun$toScalaFunction$1, <function1>)
at org.apache.spark.serializer.SerializationDebugger$.improveException(SerializationDebugger.scala:40)
at org.apache.spark.serializer.JavaSerializationStream.writeObject(JavaSerializer.scala:47)
at org.apache.spark.serializer.JavaSerializerInstance.serialize(JavaSerializer.scala:81)
at org.apache.spark.util.ClosureCleaner$.ensureSerializable(ClosureCleaner.scala:312)
... 12 more
If I declare the SparkMap method static, it runs. How can that be?
Upvotes: 2
Views: 1058
Reputation: 67135
The exception is pretty self-explanatory:
object not serializable (class: myPackage.MyClass, value: myPackage.MyClass@281c8380)
Simply make your MyClass implement Serializable and it will work.
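A minimal sketch of that fix, assuming the class and the placeholder method body from the question (the return statement is only illustrative):

import java.io.IOException;
import java.io.Serializable;

import org.apache.spark.api.java.function.Function;
import org.apache.spark.sql.Row;

public class MyClass implements Serializable {

    private Function<Row, String> SparkMap() throws IOException {
        return new Function<Row, String>() {
            public String call(Row row) throws IOException {
                /* some code */
                return row.toString(); // illustrative placeholder
            }
        };
    }
}

Keep in mind that every non-transient field of MyClass must then be serializable as well, or be marked transient.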
It works when SparkMap is static because in that case only the anonymous function is serialized; an anonymous class created inside a static method keeps no hidden reference (this$0) to the enclosing MyClass instance, so Spark never tries to serialize the myClass object.
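Here is a sketch of that alternative, again assuming the placeholder body from the question; MyClass itself can stay non-serializable:

import java.io.IOException;

import org.apache.spark.api.java.function.Function;
import org.apache.spark.sql.Row;

public class MyClass {

    // The anonymous class below has no this$0 field because it is created
    // inside a static method, so Spark serializes only the Function itself.
    private static Function<Row, String> SparkMap() throws IOException {
        return new Function<Row, String>() {
            public String call(Row row) throws IOException {
                /* some code */
                return row.toString(); // illustrative placeholder
            }
        };
    }
}

In main you would then call df.javaRDD().map(SparkMap()) directly, without going through the myClass instance.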
Upvotes: 2