Philip O'Brien

Reputation: 4266

Apache Spark - JavaSparkContext cannot be converted to SparkContext error

I'm having considerable difficulty translating the Spark examples to runnable code (as evidenced by my previous question here).

The answers provided there helped me with that particular example, but now I am trying to experiment with the Multilayer Perceptron example and straight away I am encountering errors.

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.*;
import org.apache.spark.api.java.function.Function;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.ml.classification.MultilayerPerceptronClassificationModel;
import org.apache.spark.ml.classification.MultilayerPerceptronClassifier;
import org.apache.spark.ml.evaluation.MulticlassClassificationEvaluator;
import org.apache.spark.ml.param.ParamMap;
import org.apache.spark.mllib.regression.LabeledPoint;
import org.apache.spark.mllib.util.MLUtils;
import org.apache.spark.mllib.linalg.Vectors;
import org.apache.spark.sql.DataFrame;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SQLContext;

// Load training data
public class SimpleANN {
  public static void main(String[] args) {
    String path = "file:/usr/local/share/spark-1.5.0/data/mllib/sample_multiclass_classification_data.txt";
    SparkConf conf = new SparkConf().setAppName("Simple ANN");
    JavaSparkContext sc = new JavaSparkContext(conf);
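    // Compile error on the next line: loadLibSVMFile expects a SparkContext,
    // but sc is a JavaSparkContext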
    JavaRDD<LabeledPoint> data = MLUtils.loadLibSVMFile(sc, path).toJavaRDD();
  ...
  ...
  }
}

I get the following error:

[ERROR] Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:3.1:compile (default-compile) on project simple-ann: Compilation failure
[ERROR] /Users/robo/study/spark/ann/src/main/java/SimpleANN.java:[23,61] incompatible types: org.apache.spark.api.java.JavaSparkContext cannot be converted to org.apache.spark.SparkContext

Upvotes: 1

Views: 5123

Answers (1)

vincent

Reputation: 1234

MLUtils.loadLibSVMFile belongs to the Scala-facing mllib API, so it expects an org.apache.spark.SparkContext rather than the Java wrapper. If you need a SparkContext from your JavaSparkContext, you can use the static method:

JavaSparkContext.toSparkContext(yourJavaSparkContext)

So you have to modify your code from:

JavaSparkContext sc = new JavaSparkContext(conf);
JavaRDD<LabeledPoint> data = MLUtils.loadLibSVMFile(sc, path).toJavaRDD();

to:

JavaSparkContext sc = new JavaSparkContext(conf);
JavaRDD<LabeledPoint> data = MLUtils.loadLibSVMFile(
        JavaSparkContext.toSparkContext(sc),
        path).toJavaRDD();
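
Equivalently, since JavaSparkContext is just a thin wrapper around the Scala context, it exposes the wrapped instance through its sc() method. Assuming the same Spark 1.x API as in your example, this sketch should also compile:

JavaSparkContext sc = new JavaSparkContext(conf);
// sc.sc() returns the underlying org.apache.spark.SparkContext
JavaRDD<LabeledPoint> data = MLUtils.loadLibSVMFile(sc.sc(), path).toJavaRDD();

Both forms do the same thing; toSparkContext simply returns the context wrapped by the JavaSparkContext.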

Upvotes: 4
