hermi zied

Reputation: 55

Spark Scala error

Hey, I am trying to use Spark with Scala. When I compile this code:

    import org.apache.spark.SparkConf
    import org.apache.spark.api.java.JavaSparkContext

    object d29 {
      def main(args: Array[String]): Unit = {
        val name: String = "myspark"
        val master: String = "local[1]"
        val conf: SparkConf = new SparkConf().setAppName(name).setMaster(master)
        val spContext: JavaSparkContext = new JavaSparkContext(conf)
        val file = spContext.textFile("zed/text.csv")
        val mapped = file.map(s => s.length)
      }
    }

I get this error on the s parameter: missing parameter type

Thank you.

Upvotes: 2

Views: 499

Answers (2)

Balaji Reddy

Reputation: 5700

    import org.apache.spark.{SparkConf, SparkContext}

    object d29 {
      def main(args: Array[String]): Unit = {
        val name: String = "myspark"
        val master: String = "local[1]"
        val conf: SparkConf = new SparkConf().setAppName(name).setMaster(master)
        // Use the native Scala SparkContext instead of JavaSparkContext.
        val spContext = new SparkContext(conf)
        val file = spContext.textFile("zed/text.csv")
        // s is now inferred as String, so the lambda compiles.
        val mapped = file.map(s => s.length)
      }
    }

JavaSparkContext is a Java wrapper around SparkContext. Since you are using Scala, you don't need the wrapper. The compile error comes from the wrapper: JavaRDD.map expects a Java-style org.apache.spark.api.java.function.Function, not a Scala function, so the compiler cannot infer the type of s in your lambda.
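For illustration, if you did want to keep the JavaSparkContext, you would have to pass an explicit Java Function instead of a Scala lambda. A sketch, reusing the file value from the question:

    import org.apache.spark.api.java.function.{Function => JFunction}

    // Java-style function object required by JavaRDD.map; the explicit
    // parameter type is what the plain Scala lambda was missing.
    val mapped = file.map(new JFunction[String, java.lang.Integer] {
      override def call(s: String): java.lang.Integer = s.length
    })

Dropping down to the plain SparkContext, as above, is the simpler fix.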

Upvotes: 1

L. CWI

Reputation: 962

If you are using Scala, do not create a JavaSparkContext. Use SparkContext instead:

    val spContext: SparkContext = new SparkContext(conf)

Then your code will work.

If you are using Spark 2, use the new SparkSession instead of SparkContext.
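A minimal sketch of that route, reusing the app name, master, and file path from the question:

    import org.apache.spark.sql.SparkSession

    object d29 {
      def main(args: Array[String]): Unit = {
        // SparkSession is the single entry point in Spark 2.x.
        val spark = SparkSession.builder()
          .appName("myspark")
          .master("local[1]")
          .getOrCreate()

        // The underlying SparkContext is still available for RDD work.
        val file = spark.sparkContext.textFile("zed/text.csv")
        val mapped = file.map(s => s.length)

        spark.stop()
      }
    }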

Upvotes: 4
