Reputation:
Would there have been a difference in performance if C#'s built-in data types were implemented the same way that Java's primitive data types are implemented?
For example, in C#, if we write the following code:
int foo = 1;
we are declaring and instantiating an object, right?
However, in Java, when we write the same piece of code, we are not instantiating an object, since int is "implemented directly by the Java compiler and the JVM."
So I was wondering if it would have made a difference if C# had used the same implementation as Java -- i.e. by not making an object every time a primitive data type like int is used in C#.
Upvotes: 0
Views: 280
Reputation: 1062650
For example, in C#, if we write the following code:
int foo = 1;
we are declaring and instantiating an object, right?
Nope; you are declaring either a local value-type on the stack, or a value-type instance field. Zero objects were involved. int is not an object-type, and Int32 and int are the exact same thing in .NET.
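As an illustrative sketch (the variable names here are made up, not from the original post): int and Int32 name the same value type, and a heap object only appears if you box the value explicitly or convert it to object:

using System;

class Demo
{
    static void Main()
    {
        int foo = 1;                  // plain value; no heap object is created
        Int32 bar = foo;              // Int32 is just another spelling of int
        Console.WriteLine(typeof(int) == typeof(Int32));  // True: same type

        object boxed = foo;           // boxing: only here does a heap object appear
        Console.WriteLine(boxed);     // 1
    }
}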
(Actually, there is a scenario where foo can end up causing an extra object - related to how locals in some methods are handled: async methods, iterator blocks, lambdas, etc.)
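A hedged sketch of that caveat (assumed example): when a local is captured by a lambda (or survives across an await or yield), the compiler hoists it into a generated closure object, so the "local" int ends up as a field on the heap:

using System;

class CaptureDemo
{
    static Func<int> MakeCounter()
    {
        int foo = 1;                  // captured local: hoisted into a compiler-generated
                                      // closure object, so it now lives on the heap
        return () => foo++;           // the lambda keeps that object alive
    }

    static void Main()
    {
        var next = MakeCounter();
        Console.WriteLine(next());    // 1
        Console.WriteLine(next());    // 2
    }
}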
by not making an object every time a primitive data type like int is used in C#.
That's exactly what C# does, and has always done. Better still: unlike Java, this also works with generics.
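A small sketch of that generics point (an assumed example, not from the original answer): a List<int> in .NET stores the int values directly in its internal int[], with no boxing, whereas Java's List<Integer> must box each element:

using System;
using System.Collections.Generic;

class GenericsDemo
{
    static void Main()
    {
        // The elements are stored inline as ints, not as boxed objects
        // (Java's List<Integer> would allocate an Integer per element).
        var numbers = new List<int> { 1, 2, 3 };
        int sum = 0;
        foreach (int n in numbers)    // no unboxing needed when reading back
        {
            sum += n;
        }
        Console.WriteLine(sum);       // 6
    }
}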
Upvotes: 5