K.AJ

Reputation: 1292

NoSuchMethodError on a Map (Spark Scala class)

I am running Spark 1.4.1 with Scala 2.11 in standalone mode on my local box. I have the following code:

object Parser {
    def main(args: Array[String]) {
        if (args.length < 6) {
            System.err.println("Usage: my.Parser <host> <input_loc> " +
                               "<input_dt> <match_term> <out_loc> <file_type>")
            System.exit(1)
        }

        println(" *** Starting summariziation process *** ")

        var host : String = args(0)
        var inploc : String  = args(1)
        val inpdate : String  = args(2)
        val matchTerm : String  = args(3)
        val outloc : String = args(4)
        val fileType : String = args(5)

        println(" <------------------------------------------- debug ::0.0 ")

        val typesMap = Map("data" -> "data", "rider" -> "mon", "sms" -> "sms", "voice" -> "rec", "voucher" -> "vou")
        println( " typesMap - " + typesMap)
        // ...
    }
}

When I run this code through spark-shell it works just fine. But when I run it through spark-submit as a compiled class, I get the following error:

 *** Starting summariziation process ***
 <------------------------------------------------- debug ::0.0
 Exception in thread "main" java.lang.NoSuchMethodError: 
     scala.Predef$.ArrowAssoc(Ljava/lang/Object;)Ljava/lang/Object;
    at my.Parser$.main(Parser.scala:138)

All I want is a simple look up to derive the file types to process.

It seems the line where I create the Map is what triggers the error. I'm really stumped as to why it works within spark-shell but fails with spark-submit.
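For reference, here is a small self-contained sketch of what I understand the `->` syntax to compile down to (the explicit `ArrowAssoc` call matches the method named in the stack trace), plus a tuple-literal form of the same Map that avoids that call entirely:

```scala
object ArrowAssocSketch {
  // `key -> value` is sugar for an implicit conversion defined in Predef;
  // its JVM signature is what the NoSuchMethodError complains about.
  val sugared: (String, String) = "data" -> "data"

  // Roughly the explicit call the compiler emits for the line above:
  val explicit: (String, String) = scala.Predef.ArrowAssoc("data").->("data")

  // Tuple literals build Tuple2 directly, without going through ArrowAssoc:
  val typesMap: Map[String, String] = Map(
    ("data", "data"), ("rider", "mon"), ("sms", "sms"),
    ("voice", "rec"), ("voucher", "vou"))
}
```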

Has anyone run into this issue? Can someone suggest how to fix it? Thank you in advance for your help!

Upvotes: 4

Views: 3878

Answers (2)

Kshitij Kulshrestha

Reputation: 2072

As Daniel Darabos said, either build Spark again for Scala 2.11, or simply downgrade your project to Scala 2.10.
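For the downgrade route, a minimal sketch of the relevant build settings, assuming an sbt project (the exact 2.10 patch version is illustrative):

```scala
// build.sbt (sketch): compile against the same Scala line as the
// prebuilt Spark 1.4.1 binaries, which target Scala 2.10.
scalaVersion := "2.10.4"

// Spark stays a provided dependency; spark-submit supplies it at runtime.
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.4.1" % "provided"
```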

Upvotes: 0

Daniel Darabos

Reputation: 27456

The prebuilt Apache Spark 1.4.1 downloads are built for Scala 2.10. If you want to use Scala 2.11, you can build Spark yourself with -Dscala-2.11. See Building for Scala 2.11 in the documentation.
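As a rough sketch of that build, based on my reading of the Spark 1.4 docs (check the exact profiles against your Hadoop setup):

```shell
# Switch the build to Scala 2.11, then build with the -Dscala-2.11 property:
dev/change-version-to-2.11.sh
mvn -Pyarn -Phadoop-2.4 -Dscala-2.11 -DskipTests clean package
```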

Upvotes: 3
