Reputation: 42276
Is the use of implicit enum fields to represent numeric values a necessarily bad practice?
Here is a use case: I want an easy way to represent hex digits, and since C# enums are based on integers, they seem like a natural match. I don't like a char or a string here, because I would have to explicitly validate their values. The problem with enums is that the digits [0-9] are not valid field identifiers (with good reason). It occurred to me that I don't need to declare the digits 0-9, because they are implicitly present.
So, my hex digit enum would look like:
public enum Hex : int {
    A = 10,
    B = 11,
    C = 12,
    D = 13,
    E = 14,
    F = 15
}
So, I could write Tuple<Hex,Hex> r = Tuple.Create(Hex.F, (Hex)1); and r.Item1.ToString() + r.Item2.ToString() would give me "F1". (Casting 1 to Hex is legal even though no field has that value; ToString() simply falls back to the numeric value.) Basically, my question is: if the ToString() value of the numeric constant is already the name I would give the enum field, why is it problematic to omit the declaration entirely?
An alternative representation as an enum could have the fields declared with some prefix, such as:
public enum Hex : int {
    _0 = 0,
    _1 = 1,
    _2 = 2,
    _3 = 3,
    _4 = 4,
    _5 = 5,
    _6 = 6,
    _7 = 7,
    _8 = 8,
    _9 = 9,
    A = 10,
    B = 11,
    C = 12,
    D = 13,
    E = 14,
    F = 15
}
The problem is that the above example would give me "F_1" instead of "F1". Obviously, this is easy to fix. I'm wondering if there are additional problems with the implicit approach that I am not considering.
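For illustration, the fix could be as small as stripping the prefix when formatting; HexFormatter is a hypothetical helper name of my own:

public static class HexFormatter
{
    // "_1" becomes "1"; "F" is unchanged because it has no prefix.
    public static string Format(Hex digit)
    {
        return digit.ToString().TrimStart('_');
    }
}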
Upvotes: 15
Views: 1034
Reputation: 1071
In my opinion, it is a bad practice. If you need Hex representation, simply create a helper class that handles all the operations you require.
As this article suggests, these code snippets will help in creating the helpers:
// Store the integer 182
int decValue = 182;
// Convert the integer to a hex string
string hexValue = decValue.ToString("X");
// Convert the hex string back to the number
int decAgain = int.Parse(hexValue, System.Globalization.NumberStyles.HexNumber);
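Assembled into the kind of helper class suggested above, the snippets might look like this (HexHelper and the method names are my own):

public static class HexHelper
{
    // Format an integer as an upper-case hex string, e.g. 182 -> "B6".
    public static string ToHex(int value)
    {
        return value.ToString("X");
    }

    // Parse a hex string back to an integer, e.g. "B6" -> 182.
    public static int FromHex(string hex)
    {
        return int.Parse(hex, System.Globalization.NumberStyles.HexNumber);
    }
}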
The reason I believe it's bad practice is that it's not object-oriented, and it relies on the enum to translate hard-coded values, which is bad. If you can avoid hard-coding anything, it's always a step in the right direction. Also, a helper class is extensible and can be improved over time with additional functionality.
That being said, I DO like the simplicity of enums, but, again, that doesn't supersede the need to keep things OO in my opinion.
Upvotes: 1
Reputation: 517
I don't particularly see why you would want to do this, but you could use the Description attribute on each of your enum values to get rid of the _, and create some kind of static function that lets you get at a value's text easily, like Hex(15) -> 'F'.
public enum Hex {
    [Description("0")] _0 = 0,
    [Description("1")] _1 = 1,
    // ... and so on through _9, then A = 10 up to F = 15
}
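Reading the attribute back requires a little reflection; a minimal sketch (the extension method name ToHexString is my own) could look like:

using System;
using System.ComponentModel;
using System.Reflection;

public static class HexExtensions
{
    // Returns the [Description] text of the enum member when present,
    // and falls back to the plain member name (or raw number) otherwise.
    public static string ToHexString(this Hex value)
    {
        FieldInfo field = typeof(Hex).GetField(value.ToString());
        if (field == null)
        {
            return value.ToString();
        }
        var attribute = (DescriptionAttribute)Attribute.GetCustomAttribute(
            field, typeof(DescriptionAttribute));
        return attribute != null ? attribute.Description : value.ToString();
    }
}

With that, Hex._1.ToHexString() yields "1" rather than "_1".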
Upvotes: 0
Reputation: 13030
I'm not sure what you're actually trying to accomplish here, but if you're looking to limit something to two hexadecimal digits, why wouldn't you just declare it as a byte? While your enum hack is clever, I don't actually see the need for it. It's also likely to be misunderstood if passed to another programmer without explanation, as your use of undeclared values against your enum is counterintuitive.
Regarding number bases and literal representations, an integer in computing isn't natively base-10 or base-16; it's base-2 (binary) under the covers, and any other representation is a human convenience. The language already contains ways to represent literal numbers in both decimal and hexadecimal format. Limiting the number is a function of appropriately choosing the type.
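For example, decimal and hexadecimal literals produce the exact same stored value:

int fromDecimal = 241;
int fromHex = 0xF1;  // identical number; only the source notation differs
Console.WriteLine(fromDecimal == fromHex);  // True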
If you are instead trying to limit something to any arbitrary even quantity of hexadecimal digits, perhaps simply initializing a byte array like this would be more appropriate:
byte[] hexBytes = new byte[3] { 0xA1, 0xB2, 0xC3 };
Also, by keeping your value as a regular numeric type or using a byte array rather than putting it into Tuples with enums, you retain simple access to a whole range of operations that otherwise become more difficult.
Regarding limiting your numbers to arbitrary odd quantities of hexadecimal digits, you can choose a type that holds at least one digit more than you need and constrain the value at runtime. One possible implementation of this is as follows:
public class ThreeNibbleNumber
{
    private ushort _value;

    public ushort Value
    {
        get
        {
            return _value;
        }
        set
        {
            if (value > 4095) // 0xFFF, the largest three-nibble value
            {
                throw new ArgumentException("The number is too large.");
            }
            else
            {
                _value = value;
            }
        }
    }

    public override string ToString()
    {
        return Value.ToString("x");
    }
}
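Used like this:

var threeNibbles = new ThreeNibbleNumber { Value = 0xF1 };
Console.WriteLine(threeNibbles);  // prints "f1"
threeNibbles.Value = 0x1000;      // throws ArgumentException (> 0xFFF)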
In one of your comments on another answer, you reference the idea of doing CSS colors. If that's what you're after, a solution like this seems appropriate:
public struct CssColor
{
    public CssColor(uint colorValue)
    {
        byte[] colorBytes = BitConverter.GetBytes(colorValue);
        if (BitConverter.IsLittleEndian)
        {
            if (colorBytes[3] > 0)
            {
                throw new ArgumentException("The value is outside the range for a CSS color.", "colorValue");
            }
            R = colorBytes[2];
            G = colorBytes[1];
            B = colorBytes[0];
        }
        else
        {
            if (colorBytes[0] > 0)
            {
                throw new ArgumentException("The value is outside the range for a CSS color.", "colorValue");
            }
            R = colorBytes[1];
            G = colorBytes[2];
            B = colorBytes[3];
        }
    }

    public byte R;
    public byte G;
    public byte B;

    public override string ToString()
    {
        // "x2" pads each byte to two digits so #0A0B0C doesn't collapse to #ABC.
        return string.Format("#{0:x2}{1:x2}{2:x2}", R, G, B).ToUpperInvariant();
    }

    public static CssColor Parse(string s)
    {
        if (s == null)
        {
            throw new ArgumentNullException("s");
        }
        s = s.Trim();
        if (!s.StartsWith("#") || s.Length > 7)
        {
            throw new FormatException("The input is not a valid CSS color string.");
        }
        s = s.Substring(1);
        uint color = uint.Parse(s, System.Globalization.NumberStyles.HexNumber);
        return new CssColor(color);
    }
}
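For example:

CssColor color = CssColor.Parse("#F17A2B");
Console.WriteLine(color);    // "#F17A2B"
Console.WriteLine(color.R);  // 241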
Upvotes: 0
Reputation: 21688
It's bad practice because it's a clever trick that's surprising to the people who read your code. It surprised me that it actually worked; it had me saying WTF. Remember the only valid measurement of code quality: WTFs per minute.
Clever tricks don't belong in code that's meant to be read and maintained by others. If you want to output a number as hex, convert it to a hex string using the normal String.Format("{0:X}", value).
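For instance:

int value = 241;
string hex = String.Format("{0:X}", value);  // "F1"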
Upvotes: 14
Reputation: 942177
This is a fundamentally broken way to handle hex. Hex is a human interface detail. It is always a string, a representation of a number. Like "1234" is a representation of the value 1234. It happens to be "4D2" when represented in hex but the number in your program is still 1234. A program should only ever concern itself with the number, never with the representation.
Converting a number to hex should only happen when you display the number to human eyes. That is simple to do with ToString("X"), and you parse it back from human input with TryParse() using NumberStyles.HexNumber. Input and output; at no other point should you ever deal with hex.
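In code, that boundary looks like this (a minimal sketch):

int number = 1234;

// Output: convert to hex only when showing the value to a human.
string display = number.ToString("X");  // "4D2"

// Input: parse hex only when reading human input back in.
int parsed;
bool ok = int.TryParse(display,
    System.Globalization.NumberStyles.HexNumber,
    System.Globalization.CultureInfo.InvariantCulture,
    out parsed);  // parsed == 1234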
Upvotes: 7
Reputation:
I would define a struct for HexDigit. You can add HexDigit 'A' to 'F' as static constants (or static readonly fields). You can define implicit converters to allow conversion of integers 0-9, conversion to integers, and you can override ToString() to make your Tuples look nice.
That will be much more flexible than an enum.
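A minimal sketch of that struct; the exact shape here is my own interpretation of the description above:

using System;

public struct HexDigit
{
    private readonly int _value;

    private HexDigit(int value)
    {
        _value = value;
    }

    // The digits A-F as static readonly fields, as suggested.
    public static readonly HexDigit A = new HexDigit(10);
    public static readonly HexDigit B = new HexDigit(11);
    public static readonly HexDigit C = new HexDigit(12);
    public static readonly HexDigit D = new HexDigit(13);
    public static readonly HexDigit E = new HexDigit(14);
    public static readonly HexDigit F = new HexDigit(15);

    // Implicit conversion from the integers 0-9; A-F come from the fields.
    public static implicit operator HexDigit(int value)
    {
        if (value < 0 || value > 9)
        {
            throw new ArgumentOutOfRangeException("value");
        }
        return new HexDigit(value);
    }

    // Implicit conversion back to an integer.
    public static implicit operator int(HexDigit digit)
    {
        return digit._value;
    }

    public override string ToString()
    {
        return _value.ToString("X");  // "0"-"9" or "A"-"F"
    }
}

With this, r.Item1.ToString() + r.Item2.ToString() for r = Tuple.Create(HexDigit.F, (HexDigit)1) gives "F1", matching the goal in the question.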
Upvotes: 3