Reputation: 17676
How can I port this Java inner function, which is fully contained below, to Scala?
JavaPairRDD<Envelope, HashSet<Point>> castedResult = joinListResultAfterAggregation.mapValues(new Function<HashSet<Geometry>, HashSet<Point>>()
{
    @Override
    public HashSet<Point> call(HashSet<Geometry> spatialObjects) throws Exception {
        HashSet<Point> castedSpatialObjects = new HashSet<Point>();
        Iterator spatialObjectIterator = spatialObjects.iterator();
        while (spatialObjectIterator.hasNext())
        {
            castedSpatialObjects.add((Point) spatialObjectIterator.next());
        }
        return castedSpatialObjects;
    }
});
return castedResult;
My approach, as outlined below, does not compile due to a NotInferredU error.
val castedResult = joinListResultAfterAggregation.mapValues(new Function[java.util.HashSet[Geometry], java.util.HashSet[Point]]() {
  def call(spatialObjects: java.util.HashSet[Geometry]): java.util.HashSet[Point] = {
    val castedSpatialObjects = new java.util.HashSet[Point]
    val spatialObjectIterator = spatialObjects.iterator
    while (spatialObjectIterator.hasNext)
      castedSpatialObjects.add(spatialObjectIterator.next.asInstanceOf[Point])
    castedSpatialObjects
  }
})
Upvotes: 0
Views: 86
Reputation: 170723
When asking a question about compilation errors, please provide the exact error, especially when your code doesn't stand on its own.
The inner function itself is fine; my guess would be that due to changes above, joinListResultAfterAggregation isn't a JavaPairRDD anymore, but a normal RDD[(Envelope, Something)] (where Something could be java.util.HashSet, scala.collection.Set or some subtype), so its mapValues takes a Scala function, not an org.apache.spark.api.java.function.Function. Scala functions are written as lambdas: spatialObjects: Something => ... (the body will depend on what Something actually is, and the argument type can be omitted in some circumstances).
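For instance, assuming joinListResultAfterAggregation is now a plain RDD[(Envelope, java.util.HashSet[Geometry])] (Envelope, Geometry and Point being the types from the question), a minimal sketch of the lambda version would be:

// Assumption: joinListResultAfterAggregation is an RDD[(Envelope, java.util.HashSet[Geometry])].
// mapValues on a Scala pair RDD takes a plain Scala function, not an
// org.apache.spark.api.java.function.Function.
val castedResult = joinListResultAfterAggregation.mapValues { spatialObjects =>
  val castedSpatialObjects = new java.util.HashSet[Point]
  val it = spatialObjects.iterator
  while (it.hasNext) castedSpatialObjects.add(it.next.asInstanceOf[Point])
  castedSpatialObjects
}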
Upvotes: 1
Reputation: 1029
How about this?
val castedResult = joinListResultAfterAggregation.mapValues(spatialObjects => {
  spatialObjects.map(obj => obj.asInstanceOf[Point])
})
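Note that map is only available like this if spatialObjects is a Scala collection. If the values are still java.util.HashSet (an assumption, since the question doesn't show the current value type), one sketch using the standard Java/Scala converters would be:

import scala.collection.JavaConverters._

// Assumption: the values are java.util.HashSet[Geometry]; convert to a Scala set,
// cast each element, then wrap the result back into a Java HashSet.
val castedResult = joinListResultAfterAggregation.mapValues { spatialObjects =>
  new java.util.HashSet[Point](spatialObjects.asScala.map(_.asInstanceOf[Point]).asJava)
}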
Upvotes: 1