giri

Reputation: 27199

if(null!=variable) why not if(variable!=null)

Hi. In our company we follow a strict rule for comparisons against null. When I write if(variable!=null), I get code review comments asking me to change it to if(null!=variable). Is there any performance hit for either form? If anybody could explain, it would be highly appreciated.

Thanks in advance

Upvotes: 4

Views: 15706

Answers (4)

Karl von Moor

Reputation: 8614

This has nothing to do with performance. The convention exists to prevent you from accidentally assigning instead of comparing: an assignment like null = var makes no sense and won't compile. But in Java, if (var = null) won't compile either, so turning the operands around no longer serves any purpose and only makes the code less readable.
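A minimal Java sketch of the point (the variable name is just illustrative):

public class NullCheckDemo {
    public static void main(String[] args) {
        String variable = "hello";

        if (variable != null) {        // natural order, compiles fine
            System.out.println("not null");
        }
        if (null != variable) {        // "Yoda" order, equally valid but no safer in Java
            System.out.println("still not null");
        }
        // if (variable = null) { }    // does not compile: the condition must be a boolean
    }
}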

Upvotes: 3

user330315

Reputation:

It's a "left-over" from old C-coding standards.

In C, the expression if (var = null) would compile without problems, but it would actually assign the value null to the variable, doing something completely different from the intended comparison. This was the source of very annoying bugs in C programs.

In Java that expression does not compile, so the convention is more a tradition than anything else. It doesn't serve any purpose (other than coding style preference).

Upvotes: 3

JB Nizet

Reputation: 691655

I don't see any advantage in following this convention. In C, where boolean types don't exist, it's useful to write

if (5 == variable)

rather than

if (variable == 5)

because if you forget one of the equals signs, you end up with

if (variable = 5)

which assigns 5 to variable and always evaluates to true. But in Java, a boolean is a boolean, so if (variable = 5) simply doesn't compile. And with !=, there is no reason at all to reverse the operands.

One good advice, though, is to write

if (CONSTANT.equals(myString))

rather than

if (myString.equals(CONSTANT))

because it helps avoid NullPointerExceptions.
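A minimal sketch of why, with a hypothetical constant name, in case myString is null:

public class EqualsOrderDemo {
    private static final String CONSTANT = "expected";

    public static void main(String[] args) {
        String myString = null;

        System.out.println(CONSTANT.equals(myString));    // prints false, no exception
        // System.out.println(myString.equals(CONSTANT)); // would throw NullPointerException
    }
}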

My advice would be to ask for a justification of the rule. If there's none, why follow it? It doesn't help readability.

Upvotes: 11

Joe Enos

Reputation: 40393

No performance difference - the reason is that if you get used to writing (null == somevar) instead of (somevar == null), then you'll never accidentally use a single equals sign instead of two, because the compiler will reject (null = somevar), whereas languages like C will happily accept (somevar = null) as a condition. They're just extending this to != to keep it consistent.

I personally prefer (somevar == null), but I see where they're coming from.

Upvotes: 3
