Reputation: 191
My question is: how do I make counters that I can put values like doubles in? (Yes, I did use LongValue, but that gives me 0.)
Upvotes: 2
Views: 1607
Reputation: 6139
As a workaround, you can do something like this:
// Scale the double and truncate to a long so it fits in a counter
// (this keeps four decimal places of precision).
long convert = (long) (mydoubleVal * 10000);
context.getCounter(MyCounter.name1).setValue(convert);
And in the Driver you can get the double value back:
// Read the counter and undo the scaling.
long c2 = job0.getCounters().findCounter(MyCounter.name1).getValue();
double getMyVal = (double) c2 / 10000;
Upvotes: 1
Reputation: 9844
Hadoop MapReduce job counters are by definition Java long values. A MapReduce job implementation can obtain a handle to a Counter through the TaskAttemptContext. After obtaining a handle to a Counter, the job can either increment the counter by a delta or set it to a specific value.
http://hadoop.apache.org/docs/r2.7.1/api/org/apache/hadoop/mapreduce/Counter.html#increment(long)
http://hadoop.apache.org/docs/r2.7.1/api/org/apache/hadoop/mapreduce/Counter.html#setValue(long)
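For example, from inside a Mapper or Reducer it could look like this (a minimal sketch; the MyCounters enum and its member name are assumptions for illustration, not part of the question):
// Hypothetical counter enum declared alongside the job classes.
public enum MyCounters { RECORD_COUNT }

// Inside map() or reduce(), using the task context:
context.getCounter(MyCounters.RECORD_COUNT).increment(1);     // add a delta
context.getCounter(MyCounters.RECORD_COUNT).setValue(100L);   // or set an absolute value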
Notice that the method signatures are all specified in terms of long. The domain model does not support use of double or any other data type as a counter value.
If it's absolutely necessary, then you could come up with some creative way to encode your data type into a long. One way to do this would be to take advantage of the fact that both long and double are 64 bits wide. You could then use Double#doubleToLongBits to encode the double value as a long.
http://docs.oracle.com/javase/7/docs/api/java/lang/Double.html#doubleToLongBits(double)
However, the only way to make sense of this later would be to write custom code that unpacks that counter value after the job finishes and passes it to Double#longBitsToDouble.
http://docs.oracle.com/javase/7/docs/api/java/lang/Double.html#longBitsToDouble(long)
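As an illustration only, here is a minimal sketch of what that packing and unpacking could look like (the MyCounters enum, its MY_DOUBLE_BITS member, and the variable names are assumptions; context is the usual task context and job is the driver-side Job):
// Hypothetical counter used only to carry the encoded double.
public enum MyCounters { MY_DOUBLE_BITS }

// In the task (mapper/reducer): pack the double's raw 64 bits into the long-valued counter.
double myDoubleVal = 3.14159;
context.getCounter(MyCounters.MY_DOUBLE_BITS).setValue(Double.doubleToLongBits(myDoubleVal));

// In the driver, after the job finishes: unpack the bits back into a double.
long bits = job.getCounters().findCounter(MyCounters.MY_DOUBLE_BITS).getValue();
double restored = Double.longBitsToDouble(bits);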
This would be a very unusual usage of Hadoop MapReduce job counters though.
Upvotes: 4