NickyPatel

Reputation: 555

Spark Scala code in Scala IDE is not showing "Run As Scala Application"

I am a Spark Scala developer, but I was facing a weird problem: when I tried to execute Scala code with a main method, the IDE was not showing me the option to run it as a Scala application.

I was completely clueless, because generally this happens when there is no main method, but here the main method was present.

The code is as below.

package org.apache.spark.examples.sql

import org.apache.spark.sql.{Encoder, Encoders, SparkSession}
import org.apache.spark.sql.expressions.Aggregator

// scalastyle:off println
object SimpleTypedAggregator {

  def main(args: Array[String]): Unit = {
    val spark = SparkSession
      .builder
      .master("local[*]")
      .appName("common typed aggregator implementations")
      .getOrCreate()
    spark.sparkContext.setLogLevel("ERROR")

    // spark.implicits._ enables the 'id symbol column syntax and the
    // Encoder needed by .as[(Long, Long)]
    import spark.implicits._
    val ds = spark.range(20).select(('id % 3).as("key"), 'id).as[(Long, Long)]
    println("input data:")
    ds.show()

    spark.stop()
  }
}

Upvotes: 0

Views: 198

Answers (2)

NickyPatel

Reputation: 555

The package declaration I was using was not correct. Surprised!!! Eclipse does not complain about the wrong package, yet it causes the very issue I mentioned above.

Anyway, this may help anybody who hits the same issue, so they need not spend time like I did to find such a silly thing.
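
For illustration, a minimal sketch (the file path here is my assumption of a typical layout, not taken from the question): the package declaration must mirror the folder path under the source root, so a file at src/main/scala/org/apache/spark/examples/sql/SimpleTypedAggregator.scala needs exactly this declaration.

// Hypothetical layout: this file is assumed to live at
// src/main/scala/org/apache/spark/examples/sql/SimpleTypedAggregator.scala.
// The package clause must mirror that folder path under the source root.
package org.apache.spark.examples.sql

object SimpleTypedAggregator {
  // Once package and path agree, Scala IDE detects this entry point
  // and offers "Run As > Scala Application" again.
  def main(args: Array[String]): Unit =
    println("package declaration matches the directory structure")
}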

Thanks all for whatever you did to solve this problem. :)

Upvotes: 1

Changjun

Reputation: 11

Are you using IntelliJ IDEA? If so, right-click the project, choose Add Framework Support..., then check the Scala checkbox.
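
Alternatively, if the project is built with sbt, a minimal build.sbt along these lines (the names and version numbers below are my assumptions, not from the question) gives the module Scala support when the project is imported, without going through the IDE dialog:

// Hypothetical minimal build.sbt for a Spark project; importing it into
// IntelliJ IDEA sets up Scala support for the module automatically.
ThisBuild / scalaVersion := "2.12.15"

lazy val root = (project in file("."))
  .settings(
    name := "spark-example",
    // Spark 3.x built for Scala 2.12; adjust versions to match your setup
    libraryDependencies += "org.apache.spark" %% "spark-sql" % "3.1.2"
  )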

Upvotes: 1
