john

Reputation: 31

Getting errors importing Spark dependencies in IntelliJ IDEA

I am using IntelliJ IDEA with Maven integration, but I am getting errors on the following lines:

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.api.java.function.Function;

I am trying to run the following example:

package com.spark.hello;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.api.java.function.Function;

public class Hello {

    public static void main(String[] args) {
        String logFile = "F:\\Spark\\a.java";
        // No master is set here, so the master URL must be supplied
        // externally, e.g. when the app is launched with spark-submit.
        SparkConf conf = new SparkConf().setAppName("Simple Application");
        JavaSparkContext sc = new JavaSparkContext(conf);
        // Cache the file's lines, since they are filtered twice below.
        JavaRDD<String> logData = sc.textFile(logFile).cache();

        // Count the lines containing the letter "a".
        long numAs = logData.filter(new Function<String, Boolean>() {
            public Boolean call(String s) { return s.contains("a"); }
        }).count();

        // Count the lines containing the letter "b".
        long numBs = logData.filter(new Function<String, Boolean>() {
            public Boolean call(String s) { return s.contains("b"); }
        }).count();

        System.out.println("Lines with a: " + numAs + ", lines with b: " + numBs);
    }
}

Please help me solve this issue, or is there another way to run this kind of project?

Upvotes: 2

Views: 1266

Answers (1)

hippoLogic

Reputation: 304

Without seeing the error, I'm guessing the IDE is telling you they are unused imports. Be sure to double-check the dependencies and the versions.
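For reference, the Spark core dependency entry in pom.xml looks something like this; the Scala suffix on the artifactId and the version number below are placeholders, so match them to the Spark build you are targeting:

    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.11</artifactId>
        <version>2.2.0</version>
    </dependency>

After editing the POM, reimport the Maven project in IntelliJ so the jars are downloaded and put on the classpath.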

Alt + Enter is the shortcut I've used to resolve many of these issues.

Upvotes: 1
