ajayramesh

Reputation: 3774

How to sort list based on its value in spark java?

I want to sort the houses by price in ascending order.

public class House implements Serializable {

    private double price = Math.random() * 1000;

    public double getPrice() {
        return price;
    }
}

This is how I am doing it sequentially:

ArrayList<House> city; // assume it is initialized with some values
Collections.sort(city, new Comparator<House>() {
    @Override
    public int compare(House o1, House o2) {
        return Double.compare(o1.getPrice(), o2.getPrice());
    }
});

Now I want to sort it using Apache Spark in Java.

Method one:

JavaRDD<House> r2 = houseRDD.sortBy(i -> i.getPrice(), true, 1);

Method two:

JavaRDD<House> r = houseRDD.sortBy(new Function<House, Double>() {
    private static final long serialVersionUID = 1L;

    @Override
    public Double call(House value) throws Exception {
        return value.getPrice();
    }
}, true, 1);

What's wrong with the above methods? I am getting the below exception -

java.lang.ClassCastException: House cannot be cast to java.lang.Comparable
    at org.spark_project.guava.collect.NaturalOrdering.compare(NaturalOrdering.java:28)
    at scala.math.LowPriorityOrderingImplicits$$anon$7.compare(Ordering.scala:153)
    at scala.math.Ordering$$anon$4.compare(Ordering.scala:111)
    at org.apache.spark.util.collection.Utils$$anon$1.compare(Utils.scala:35)
    at org.spark_project.guava.collect.Ordering.max(Ordering.java:551)
    at org.spark_project.guava.collect.Ordering.leastOf(Ordering.java:667)
    at org.apache.spark.util.collection.Utils$.takeOrdered(Utils.scala:37)
    at org.apache.spark.rdd.RDD$$anonfun$takeOrdered$1$$anonfun$30.apply(RDD.scala:1393)
    at org.apache.spark.rdd.RDD$$anonfun$takeOrdered$1$$anonfun$30.apply(RDD.scala:1390)
    at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:785)
    at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:785)
    at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
    at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:319)
    at org.apache.spark.rdd.RDD.iterator(RDD.scala:283)
    at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:70)
    at org.apache.spark.scheduler.Task.run(Task.scala:86)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:274)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)

As per the comment, the new House class:

public class House implements Serializable, Comparable<House> {

    private double price = Math.random() * 1000;

    public double getPrice() {
        return price;
    }

    @Override
    public int compareTo(House o) {
        return Double.compare(this.getPrice(), o.getPrice());
    }
}

Upvotes: 2

Views: 3205

Answers (2)

ajayramesh

Reputation: 3774

List<House> houses; // initialize it with some data
JavaRDD<House> houseRDD = SparkUtil.getSparkContext().parallelize(houses);

public class House implements Serializable, Comparable<House> {

    private double price = Math.random() * 1000;

    public double getPrice() {
        return price;
    }

    @Override
    public int compareTo(House o) {
        return Double.compare(this.getPrice(), o.getPrice());
    }
}

Now try the same code:

JavaRDD<House> sortedRDD = houseRDD.sortBy(i -> i.getPrice(), true, 1);

houseRDD.top(4); // returns the 4 most expensive houses, using the natural ordering
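If modifying House is not an option, JavaRDD.takeOrdered(int, Comparator) accepts an explicit comparator instead of relying on natural ordering (the comparator must also be Serializable to ship to executors). A minimal plain-Java sketch of the same comparator idea, using a hypothetical stand-in for the House class from the question:

```java
import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;

// Hypothetical stand-in for the House class from the question.
class House {
    private final double price;

    House(double price) {
        this.price = price;
    }

    double getPrice() {
        return price;
    }
}

public class ComparatorDemo {
    public static void main(String[] args) {
        List<House> houses = new ArrayList<>();
        houses.add(new House(250.0));
        houses.add(new House(50.0));
        houses.add(new House(150.0));

        // An explicit comparator removes the need for House to
        // implement Comparable at all.
        houses.sort(Comparator.comparingDouble(House::getPrice));

        System.out.println(houses.get(0).getPrice()); // prints 50.0
    }
}
```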

Upvotes: 0

recurf

Reputation: 235

It seems House does not implement the Comparable interface, which top() and takeOrdered() require when no comparator is given.

How to implement the Java comparable interface?
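A minimal sketch of that fix, assuming House just wraps a price as in the question; once compareTo defines the natural ordering, both Collections.sort and Spark's top()/takeOrdered() can use it:

```java
import java.io.Serializable;
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;

public class House implements Serializable, Comparable<House> {
    private final double price;

    public House(double price) {
        this.price = price;
    }

    public double getPrice() {
        return price;
    }

    // Natural ordering: ascending by price. Double.compare avoids
    // the pitfalls of hand-rolled comparisons on doubles.
    @Override
    public int compareTo(House o) {
        return Double.compare(this.price, o.price);
    }

    public static void main(String[] args) {
        List<House> houses = new ArrayList<>();
        houses.add(new House(300.0));
        houses.add(new House(100.0));
        houses.add(new House(200.0));

        Collections.sort(houses); // uses compareTo, no ClassCastException
        System.out.println(houses.get(0).getPrice()); // prints 100.0
    }
}
```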

Upvotes: 1
