Reputation: 47
I am using Spark with HIPI and bytedeco (OpenCV 3.2.0) to process images.
In my Spark job, I try to load the data into a PairRDD and then process those images (compute features). But when I try to load the HIPI image bundle (.hib) as FloatImages, I get an error.
CODE:
JavaPairRDD<HipiImageHeader, FloatImage> floatimages = jsc.newAPIHadoopFile(
        "hdfs://cluster-1-m/user/ibi/sampleimages.hib",
        HibInputFormat.class, HipiImageHeader.class, FloatImage.class,
        new Configuration());
ERROR:
error: method newAPIHadoopFile in class JavaSparkContext cannot be applied to given types;
    JavaPairRDD<HipiImageHeader,FloatImage> floatimages = jsc.newAPIHadoopFile("hdfs://cluster-1-m/user/ibi/sampleimages.hib",HibInputFormat.class,HipiImageHeader.class,FloatImage.class, new Configuration());
                                                             ^
  required: String,Class<F>,Class<K>,Class<V>,Configuration
  found: String,Class<HibInputFormat>,Class<HipiImageHeader>,Class<FloatImage>,Configuration
  reason: inferred type does not conform to equality constraint(s)
    inferred: HipiImage
    equality constraint(s): HipiImage,FloatImage
  where F,K,V are type-variables:
    F extends InputFormat<K,V> declared in method <F,K,V>newAPIHadoopFile(String,Class<F>,Class<K>,Class<V>,Configuration)
    K extends Object declared in method <F,K,V>newAPIHadoopFile(String,Class<F>,Class<K>,Class<V>,Configuration)
    V extends Object declared in method <F,K,V>newAPIHadoopFile(String,Class<F>,Class<K>,Class<V>,Configuration)
When I load them as HipiImages, I don't get this error, but then I don't know how to get the data out of a HipiImage, because there are no getData() and getPixelArray() methods in that class.
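For reference, here is a sketch of the load that does compile, using HipiImage as the value class (HibInputFormat seems to be declared over HipiImage rather than FloatImage, which would also explain the equality-constraint error above). The downcast to FloatImage afterwards is only my guess, based on FloatImage being a subclass of HipiImage; I am not sure it is the intended way:

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.spark.api.java.JavaPairRDD;
import org.hipi.image.FloatImage;
import org.hipi.image.HipiImage;
import org.hipi.image.HipiImageHeader;
import org.hipi.imagebundle.mapreduce.HibInputFormat;

// jsc is the existing JavaSparkContext from my job.
// Using HipiImage.class (the type HibInputFormat is declared with) compiles:
JavaPairRDD<HipiImageHeader, HipiImage> images = jsc.newAPIHadoopFile(
        "hdfs://cluster-1-m/user/ibi/sampleimages.hib",
        HibInputFormat.class, HipiImageHeader.class, HipiImage.class,
        new Configuration());

// Assumption on my part: downcast each value to FloatImage afterwards,
// so the pixel-access methods become available.
JavaPairRDD<HipiImageHeader, FloatImage> floats =
        images.mapValues(img -> (FloatImage) img);
```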
Can any of you tell me how to get the data from a HipiImage? Or, for those who have already used Spark with HIPI and the bytedeco API, how did you manage that?
Upvotes: 1
Views: 438