Ahmed

Reputation: 65

Initialize public static variable in Hadoop through arguments

I have a problem changing public static variables in Hadoop: I am trying to pass some values to the jar file as command-line arguments and assign them to static fields in main.

Here is my code:

import java.io.IOException;

import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;

public class MyClass {
  public static long myvariable1 = 100;

  public static class Map extends Mapper<Object, Text, Text, Text> {
    public static long myvariable2 = 200;

    public void map(Object key, Text value, Context context) throws IOException, InterruptedException {

    }
  }

  public static class Reduce extends Reducer<Text, Text, Text, Text> {
    public void reduce(Text key, Iterable<Text> values, Context context)
        throws IOException, InterruptedException {

    }
  }

  public static void main(String[] args) throws Exception {
    long col_no = Long.parseLong(args[0]);
    myvariable1 = Long.parseLong(args[1]);
    Map.myvariable2 = Long.parseLong(args[1]);
    // other stuff here
  }
}

But it is not working: myvariable1 and myvariable2 always keep their initial values of 100 and 200 inside the tasks. I am using Hadoop 0.20.203 on Ubuntu 10.04.
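For reference, I launch the job from the command line like this (the jar name and the values are just placeholders):

hadoop jar myjob.jar MyClass 5 300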

Upvotes: 3

Views: 1661

Answers (1)

Matt D

Reputation: 3095

Static fields assigned in main will not work here: the map and reduce tasks run in separate JVMs, usually on other machines, so they only ever see the field initializers (100 and 200), never the values you set in the driver. What you can do to get the same behavior is to store your variables in the Configuration you use to launch the job; Hadoop ships that Configuration to every task.

public static class Map extends Mapper<Object, Text, Text, Text> {
  public void map(Object key, Text value, Context context) throws IOException, InterruptedException {
    // read the value back out of the Configuration the job was launched with
    Configuration conf = context.getConfiguration();
    String var2String = conf.get("myvariable2");
    long myvariable2 = Long.parseLong(var2String);
    // etc.
  }
}
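As an aside, Configuration also has typed getters, so the manual parse can be dropped; getLong takes a default value that is returned when the key was never set:

long myvariable2 = context.getConfiguration().getLong("myvariable2", 200L);

There is a matching conf.setLong("myvariable2", value) for the driver side.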

public static void main(String[] args) throws Exception {
  long col_no = Long.parseLong(args[0]);
  String myvariable1 = args[1];
  String myvariable2 = args[1];

  // add the values to the Configuration before the Job is created
  Configuration conf = new Configuration();
  conf.set("myvariable1", myvariable1);
  conf.set("myvariable2", myvariable2);

  // other stuff here
}
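The important part of the omitted setup is that this same conf is what the Job is built from. A minimal sketch of the rest of main (the job name, argument positions, and input/output paths are placeholders, not from the original post):

  // the Job copies the Configuration at construction time,
  // so call conf.set(...) before this line
  Job job = new Job(conf, "myclass");
  job.setJarByClass(MyClass.class);
  job.setMapperClass(Map.class);
  job.setReducerClass(Reduce.class);
  job.setOutputKeyClass(Text.class);
  job.setOutputValueClass(Text.class);
  FileInputFormat.addInputPath(job, new Path(args[2]));   // placeholder positions
  FileOutputFormat.setOutputPath(job, new Path(args[3]));
  System.exit(job.waitForCompletion(true) ? 0 : 1);

(FileInputFormat and FileOutputFormat here are the new-API classes from org.apache.hadoop.mapreduce.lib.input and .output.)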

Upvotes: 4
