Reputation: 15
I have placed an mp4 file on HDFS and am trying to analyze it directly. I have a class named VideoRecordReader
in which a casting error occurs. Below is the description of the error.
You have loaded library /usr/local/lib/libopencv_core.so.3.0.0 which might have disabled stack guard. The VM will try to fix the stack guard now.
attempt_201607261400_0011_m_000000_1: It's highly recommended that you fix the library with 'execstack -c ', or link it with '-z noexecstack'.
16/07/26 17:32:27 INFO mapred.JobClient: Task Id : attempt_201607261400_0011_m_000000_2, Status : FAILED
java.lang.ClassCastException: org.apache.hadoop.mapreduce.lib.input.FileSplit cannot be cast to org.apache.hadoop.mapred.FileSplit
    at com.finalyearproject.VideoRecordReader.initialize(VideoRecordReader.java:65)
    at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.initialize(MapTask.java:521)
    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:763)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:364)
    at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1190)
    at org.apache.hadoop.mapred.Child.main(Child.java:249)
Here is the code that handles the file split:
public void initialize(InputSplit genericSplit, TaskAttemptContext context)
        throws IOException, InterruptedException {
    // This cast is where the ClassCastException is thrown (VideoRecordReader.java:65)
    FileSplit split = (FileSplit) genericSplit;
    Configuration job = context.getConfiguration();
    start = 0;
    end = 1;
    final Path file = split.getPath();
    FileSystem fs = file.getFileSystem(job);
    fileIn = fs.open(split.getPath());
    filename = split.getPath().getName();
    // Read the whole video file into memory and wrap it in a VideoObject
    byte[] b = new byte[fileIn.available()];
    fileIn.readFully(b);
    video = new VideoObject(b);
}
Kindly help me. Thank you, best regards.
Upvotes: 1
Views: 660
Reputation: 13927
It's likely you're mixing the mapred and mapreduce APIs together.
The error is complaining that you're trying to cast org.apache.hadoop.mapreduce.lib.input.FileSplit to org.apache.hadoop.mapred.FileSplit.
You need to make sure that you generally don't mix imports between the two APIs. So check whether org.apache.hadoop.mapred.FileSplit has been imported and, if so, change it to org.apache.hadoop.mapreduce.lib.input.FileSplit.
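As a minimal sketch (assuming your VideoRecordReader extends the new-API org.apache.hadoop.mapreduce.RecordReader and keeps the same fields as in your snippet), the only change needed is importing the mapreduce FileSplit so the cast matches the split class the framework actually passes in:

// Sketch only -- imports plus the relevant part of initialize(), not the full class.
import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.mapreduce.InputSplit;
import org.apache.hadoop.mapreduce.TaskAttemptContext;
import org.apache.hadoop.mapreduce.lib.input.FileSplit;  // new API, NOT org.apache.hadoop.mapred.FileSplit

public void initialize(InputSplit genericSplit, TaskAttemptContext context)
        throws IOException, InterruptedException {
    // With the import above, the cast targets the same FileSplit class the
    // new-API framework hands to the record reader, so it no longer throws.
    FileSplit split = (FileSplit) genericSplit;
    Configuration job = context.getConfiguration();
    Path file = split.getPath();
    FileSystem fs = file.getFileSystem(job);
    fileIn = fs.open(file);
    filename = file.getName();
    // ... read the bytes and build the VideoObject exactly as before ...
}

The rest of the class (and the mapper it feeds) should likewise use only the org.apache.hadoop.mapreduce.* classes, since the stack trace shows the job is running through the new-API mapper path.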
Upvotes: 1