Reputation: 61
I'm using spark-streaming-kafka checkpoints to store processed Kafka offsets in a folder on HDFS. After restarting the application (via spark-submit) to verify recovery, I get a ClassNotFoundException for a class that belongs to the spark-streaming-kafka module and is packaged into my application's uber jar. It seems the class is not being looked up in my application jar.
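For context, the app creates the streaming context roughly like this (a minimal sketch with placeholder broker, topic, and checkpoint-dir values, not my exact code):

```scala
import kafka.serializer.StringDecoder
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka.KafkaUtils

object CheckpointedApp {
  // Placeholder checkpoint directory; the real HDFS path differs.
  val checkpointDir = "hdfs://ip-xxx-xx-xx-xx:8020/user/checkpoint"

  def createContext(): StreamingContext = {
    val conf = new SparkConf().setAppName("kafka-checkpoint-recovery")
    val ssc = new StreamingContext(conf, Seconds(300))
    ssc.checkpoint(checkpointDir)

    // The direct stream records consumed offsets (OffsetRange instances)
    // in the checkpoint -- this is what fails to deserialize on restart.
    val kafkaParams = Map("metadata.broker.list" -> "broker1:9092")
    val stream = KafkaUtils.createDirectStream[String, String, StringDecoder, StringDecoder](
      ssc, kafkaParams, Set("my-topic"))
    stream.foreachRDD(rdd => println(s"batch size: ${rdd.count()}"))
    ssc
  }

  def main(args: Array[String]): Unit = {
    // On restart, getOrCreate should rebuild the context from the checkpoint
    // instead of invoking createContext again.
    val ssc = StreamingContext.getOrCreate(checkpointDir, createContext _)
    ssc.start()
    ssc.awaitTermination()
  }
}
```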
Using Spark v1.5.1, the recovery attempt fails with:
15/12/02 15:42:30 INFO streaming.CheckpointReader: Attempting to load checkpoint from file hdfs://ip-xxx-xx-xx-xx:8020/user/checkpoint-1449064500000
15/12/02 15:42:30 WARN streaming.CheckpointReader: Error reading checkpoint from file hdfs://ip-xxx-xx-xx-xx:8020/user/checkpoint-1449064500000
java.io.IOException: java.lang.ClassNotFoundException: org.apache.spark.streaming.kafka.OffsetRange
at org.apache.spark.util.Utils$.tryOrIOException(Utils.scala:1163)
at org.apache.spark.streaming.DStreamGraph.readObject(DStreamGraph.scala:188)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1893)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
at org.apache.spark.streaming.Checkpoint$$anonfun$deserialize$2.apply(Checkpoint.scala:151)
at org.apache.spark.streaming.Checkpoint$$anonfun$deserialize$2.apply(Checkpoint.scala:141)
at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1206)
at org.apache.spark.streaming.Checkpoint$.deserialize(Checkpoint.scala:154)
at org.apache.spark.streaming.CheckpointReader$$anonfun$read$2.apply(Checkpoint.scala:329)
at org.apache.spark.streaming.CheckpointReader$$anonfun$read$2.apply(Checkpoint.scala:325)
at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
at scala.collection.mutable.WrappedArray.foreach(WrappedArray.scala:35)
at org.apache.spark.streaming.CheckpointReader$.read(Checkpoint.scala:325)
at org.apache.spark.streaming.StreamingContext$.getOrCreate(StreamingContext.scala:852)
...
Upvotes: 1
Views: 928
Reputation: 61
Update: I found there is an open bug on this, SPARK-5569, with a proposed fix at https://github.com/apache/spark/pull/8955.
After applying the code change from that pull request and rebuilding spark-assembly, checkpoint recovery now works.
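For reference, this is the gist of the change as I read the PR (a paraphrased sketch, not the verbatim diff): Checkpoint.deserialize resolves classes through Spark's ObjectInputStreamWithLoader, and switching its resolveClass from loader.loadClass to Class.forName lets it resolve classes (including array types such as [Lorg.apache.spark.streaming.kafka.OffsetRange;) through the supplied classloader:

```scala
import java.io.{InputStream, ObjectInputStream, ObjectStreamClass}

// Sketch of the patched stream, based on my reading of the PR:
class ObjectInputStreamWithLoader(in: InputStream, loader: ClassLoader)
    extends ObjectInputStream(in) {
  override def resolveClass(desc: ObjectStreamClass): Class[_] = {
    try {
      // Class.forName can resolve array types like
      // "[Lorg.apache.spark.streaming.kafka.OffsetRange;",
      // which ClassLoader.loadClass cannot.
      Class.forName(desc.getName, false, loader)
    } catch {
      case _: ClassNotFoundException =>
        // Fall back to the default resolution if the loader doesn't have it.
        super.resolveClass(desc)
    }
  }
}
```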
Upvotes: 1