K M Rakibul Islam

Reputation: 34318

Hadoop error: type mismatch in write method

I just wrote a simple Hadoop program where I am trying to encrypt a text file using the AES algorithm. I read the input line by line in my map method, encrypt each line, and write it to the context. Pretty simple. I am doing the encryption in my map method and using the line offset as the key, so I don't need a reducer class.
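In isolation, the per-line encryption boils down to the following (sketched here with java.util.Base64 instead of the internal sun.misc.BASE64Encoder so it compiles standalone on a modern JDK; the round-trip decrypt at the end is only a sanity check, not part of the job):

```java
import java.util.Base64;
import javax.crypto.Cipher;
import javax.crypto.KeyGenerator;
import javax.crypto.SecretKey;

public class EncLine {
    public static void main(String[] args) throws Exception {
        // Generate one AES key; 128-bit keys are supported on every stock JDK.
        KeyGenerator keyGen = KeyGenerator.getInstance("AES");
        keyGen.init(128);
        SecretKey secretKey = keyGen.generateKey();

        String line = "hello hadoop";

        // Encrypt the line and Base64-encode the ciphertext so it is safe
        // to emit as text output.
        Cipher aesCipher = Cipher.getInstance("AES");
        aesCipher.init(Cipher.ENCRYPT_MODE, secretKey);
        byte[] cipherBytes = aesCipher.doFinal(line.getBytes("UTF-8"));
        String cipherText = Base64.getEncoder().encodeToString(cipherBytes);

        // Sanity check: decrypting with the same key recovers the line.
        aesCipher.init(Cipher.DECRYPT_MODE, secretKey);
        byte[] plainBytes = aesCipher.doFinal(Base64.getDecoder().decode(cipherText));
        System.out.println(new String(plainBytes, "UTF-8").equals(line));
    }
}
```

Note that a fresh key is generated per call, so in the real mapper every line would be encrypted under a different, unrecorded key; that matches the code below.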

Here is my code:

public class Enc {

    public static class Map extends Mapper<LongWritable, Text, Text, IntWritable> {
        private Text word = new Text();

        public void map(LongWritable key, Text value, Context context) throws IOException, InterruptedException {
            String strDataToEncrypt = new String();
            String strCipherText = new String();

            KeyGenerator keyGen = KeyGenerator.getInstance("AES");
            keyGen.init(128);
            SecretKey secretKey = keyGen.generateKey();

            Cipher aesCipher = Cipher.getInstance("AES");
            aesCipher.init(Cipher.ENCRYPT_MODE, secretKey);
            strDataToEncrypt = value.toString();

            byte[] byteDataToEncrypt = strDataToEncrypt.getBytes();
            byte[] byteCipherText = aesCipher.doFinal(byteDataToEncrypt);
            strCipherText = new BASE64Encoder().encode(byteCipherText);
            System.out.println("cipher text: " + strCipherText);

            String cipherString = new String(strCipherText);
            context.write(key, new Text(cipherString));
        }
    }

    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();

        Job job = new Job(conf, "Enc");
        job.setJarByClass(Enc.class);

        job.setOutputKeyClass(LongWritable.class);
        job.setOutputValueClass(Text.class);

        job.setMapperClass(Map.class);

        job.setInputFormatClass(TextInputFormat.class);
        job.setOutputFormatClass(TextOutputFormat.class);

        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));

        job.waitForCompletion(true);
    }        
}

I am getting the following error:

The method write(Text, IntWritable) in the type TaskInputOutputContext<LongWritable,Text,Text,IntWritable> is not applicable for the arguments (LongWritable, Text)

What am I missing here?

EDIT_1:

My final working code is here:

import java.io.IOException;
import java.util.*;

import org.apache.hadoop.fs.Path;
import org.apache.hadoop.conf.*;
import org.apache.hadoop.io.*;
import org.apache.hadoop.mapreduce.*;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.input.TextInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import org.apache.hadoop.mapreduce.lib.output.TextOutputFormat;

import javax.crypto.KeyGenerator;
import javax.crypto.SecretKey;
import javax.crypto.Cipher;

import java.io.BufferedReader;
import java.io.FileNotFoundException;
import java.io.FileReader;
import java.security.NoSuchAlgorithmException;
import java.security.InvalidKeyException;
import java.security.InvalidAlgorithmParameterException;

import javax.crypto.NoSuchPaddingException;
import javax.crypto.BadPaddingException;
import javax.crypto.IllegalBlockSizeException;

import sun.misc.BASE64Encoder;

public class Enc {

      public static class Map extends Mapper<LongWritable, Text, LongWritable, Text> {
        private Text word = new Text();
        public void map(LongWritable key, Text value, Context context) throws IOException, InterruptedException {

            try {
            String strDataToEncrypt = new String();
            String strCipherText = new String();

            KeyGenerator keyGen = KeyGenerator.getInstance("AES");
            keyGen.init(128);
            SecretKey secretKey = keyGen.generateKey();

            Cipher aesCipher = Cipher.getInstance("AES");
            aesCipher.init(Cipher.ENCRYPT_MODE,secretKey);
            strDataToEncrypt = value.toString();

            byte[] byteDataToEncrypt = strDataToEncrypt.getBytes();
            byte[] byteCipherText = aesCipher.doFinal(byteDataToEncrypt); 
            strCipherText = new BASE64Encoder().encode(byteCipherText);
            System.out.println("cipher text: " + strCipherText);

            String cipherString = new String(strCipherText);
            context.write(key, new Text(cipherString));
            }
            catch (NoSuchAlgorithmException noSuchAlgo)
            {
                System.out.println(" No Such Algorithm exists " + noSuchAlgo);
            }
            catch (NoSuchPaddingException noSuchPad)
            {
                System.out.println(" No Such Padding exists " + noSuchPad);
            }
            catch (InvalidKeyException invalidKey)
            {
                System.out.println(" Invalid Key " + invalidKey);
            }
            catch (BadPaddingException badPadding)
            {
                System.out.println(" Bad Padding " + badPadding);
            }
            catch (IllegalBlockSizeException illegalBlockSize)
            {
                System.out.println(" Illegal Block Size " + illegalBlockSize);
            }

        }
    } 


    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();

        Job job = new Job(conf, "Enc");
        job.setJarByClass(Enc.class);

        job.setOutputKeyClass(LongWritable.class);
        job.setOutputValueClass(Text.class);

        job.setMapperClass(Map.class);
        //job.setCombinerClass(Reduce.class);
        //job.setReducerClass(Reduce.class);


        job.setInputFormatClass(TextInputFormat.class);
        job.setOutputFormatClass(TextOutputFormat.class);

        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));

        job.waitForCompletion(true);
    }        
}

Upvotes: 1

Views: 5019

Answers (2)

WeiChing 林煒清

Reputation: 4469

Your code context.write(key, new Text(cipherString)); passes the wrong argument types. The types the write method accepts are constrained not only by what you specify in the job definition (which you did correctly), but also by the generic parameters of your Mapper:

public static class Map extends Mapper<LongWritable, Text, Text, IntWritable>

whose output types are wrong. (The IDE is not smart enough to flag this error for you.)

Changing it to

public static class Map extends Mapper<LongWritable, Text, LongWritable, Text>

will resolve the problem.

Upvotes: 3

Alex A.

Reputation: 2736

The first important bit here is

public class Map extends Mapper<LongWritable, Text, Text, IntWritable>

which states that your Map class takes a LongWritable key and a Text value as input, and outputs a Text key and an IntWritable value.

The second important bit is

context.write(key, new Text(cipherString));

This is where you emit output from the mapper. Here key is of type LongWritable, and the second argument is a Text.

The problem, then, is a mismatch. When extending Mapper you claim your mapper outputs a Text key and an IntWritable value, but what you actually output is a LongWritable and a Text. If you indeed intended to output a LongWritable and a Text, you should change your class declaration to

public class Map extends Mapper<LongWritable, Text, LongWritable, Text>
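The error has nothing to do with Hadoop specifically; it is ordinary Java generics. A toy analogue (FakeContext below is purely hypothetical, not Hadoop's Context class) shows the same compiler behavior:

```java
// A hypothetical stand-in for Hadoop's Context: write() only accepts
// the key/value types fixed by the class-level type parameters.
class FakeContext<KEYOUT, VALUEOUT> {
    StringBuilder out = new StringBuilder();

    void write(KEYOUT key, VALUEOUT value) {
        out.append(key).append('\t').append(value).append('\n');
    }
}

public class GenericsDemo {
    public static void main(String[] args) {
        // Declared as <Long, String>, mirroring Mapper<.., .., LongWritable, Text>.
        FakeContext<Long, String> ctx = new FakeContext<>();

        ctx.write(0L, "ciphertext");  // compiles: arguments match the declaration

        // ctx.write("key", 42);      // would NOT compile: same kind of error as
        //                            // calling write(LongWritable, Text) on a
        //                            // context declared <.., .., Text, IntWritable>

        System.out.print(ctx.out);
    }
}
```

The compile error in the question is exactly the second case: the arguments to write must match the output type parameters declared on the Mapper subclass.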

Upvotes: 6
