Primitive data types and portability in Java

I quote from Herbert Schildt, Chapter 3, "Data Types, Variables, and Arrays":

The primitive types represent single values, not complex objects. Although Java is otherwise completely object-oriented, the primitive types are not. The reason for this is efficiency. Making the primitive types into objects would have degraded performance too much.

The primitive types are defined to have an explicit range and mathematical behavior. Languages such as C and C++ allow the size of an integer to vary based upon the dictates of the execution environment. However, Java is different. Because of Java’s portability requirement, all data types have a strongly defined range. For example, an int is always 32 bits, regardless of the particular platform. This allows programs to be written that are guaranteed to run without porting on any machine architecture. While strictly specifying the size of an integer may cause a small loss of performance in some environments, it is necessary in order to achieve portability.

What does he mean by the last two lines? And why might specifying the size of an integer cause a small loss of performance in some environments?

Upvotes: 6

Views: 950

Answers (4)

Mureinik

Reputation: 311188

In "lower" languages, primitive data types sizes are often derived from the CPU's ability to handle them. E.g., in , an int is defined as being "at least 16 bits in size", but its size may vary between architectures in order to assure that "The type int should be the integer type that the target processor is most efficient working with." (source). This means that if your code makes careless assumptions about an int's size, it may very well break if you port it from 32-bit to 64-bit .

Java, as noted above, is different. An int, e.g., will always be 32 bits. This means you don't have to worry about its size changing when you run the same code on a different architecture. The tradeoff, as also mentioned above, is performance: on any architecture that doesn't natively handle 32-bit calculations, these ints need to be expanded to the native size the CPU can handle (which incurs a small penalty), or worse, if the CPU can only handle smaller ints, every operation on an int may require several CPU operations.
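As a quick illustration (a minimal sketch; the class name IntOverflowDemo is just made up for this example), you can observe both the fixed size and the fixed arithmetic behavior directly:

    public class IntOverflowDemo {
        public static void main(String[] args) {
            // An int is always 32 bits in Java, so this prints 32 on every platform.
            System.out.println(Integer.SIZE);

            // The arithmetic behavior is fixed too: Integer.MAX_VALUE + 1 wraps
            // around to Integer.MIN_VALUE on every JVM, regardless of the CPU.
            int max = Integer.MAX_VALUE;
            System.out.println(max + 1 == Integer.MIN_VALUE); // true
        }
    }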

Upvotes: 9

user949300

Reputation: 15729

A few (very few) computers use a 36-bit architecture, so on those you need an extra step to mask off bits, simulate overflows, etc.
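Roughly speaking (this is only an illustrative sketch written in Java itself, not how a real JVM is implemented; the method name addAs32Bit is made up), the runtime on such a machine has to compute in the wider native word and then mask/narrow the result so it overflows exactly like a 32-bit int:

    public class MaskingSketch {
        // Compute an int addition using a wider word, then force 32-bit
        // wraparound by keeping only the low 32 bits and sign-extending.
        static long addAs32Bit(long a, long b) {
            long result = a + b;        // done in the wider native word
            result &= 0xFFFFFFFFL;      // mask off everything above bit 31
            return (int) result;        // narrow + sign-extend to int semantics
        }

        public static void main(String[] args) {
            // 2147483647 + 1 wraps to -2147483648, just like a real int.
            System.out.println(addAs32Bit(Integer.MAX_VALUE, 1));
        }
    }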

Upvotes: 3

Maleen Abewardana

Reputation: 14562

AFAIK, there is no way in Java to define the size of an integer. It is always a 32-bit int, as mentioned here. But some programming languages do allow you to specify the size of an integer (e.g., Ada).

The performance issue comes when the compiler tries to translate our code into machine code instructions (see here and here). Normally the machine's native word size is 32 or 64 bits. If our ints are the same size as the native word, they translate into machine code easily. Otherwise the compiler needs to put in extra effort to convert them, and that's where the performance issue comes from.
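You can see the fixed sizes from Java itself (a small sketch using the standard SIZE constants; the printed values are guaranteed by the language, and the class name is arbitrary):

    public class FixedSizes {
        public static void main(String[] args) {
            // These widths are fixed by the Java Language Specification, so the
            // output is 8, 16, 32, 64 on every platform. In C, the corresponding
            // sizeof results can differ between compilers and architectures.
            System.out.println(Byte.SIZE);    // 8
            System.out.println(Short.SIZE);   // 16
            System.out.println(Integer.SIZE); // 32
            System.out.println(Long.SIZE);    // 64
        }
    }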

Upvotes: 2

Ravi Trivedi

Reputation: 2360

Java implements its own data pointer mechanism on top of the underlying system's pointer mechanism. Many systems may use smaller pointers if the data is not large enough.

For example: if your integer data only requires a 16-bit pointer, the system will allocate only the storage that a 16-bit pointer needs. But if you are using Java, it will convert that 16-bit allocation into a 32-bit one, which requires more memory and also degrades performance in terms of storage and data seek time, because the pointer is larger.
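Relatedly (this is a sketch of a different but visible consequence of Java working in 32-bit units, not of the pointer mechanism described above), even byte and short operands are promoted to 32-bit int arithmetic in the language itself:

    public class WideningDemo {
        public static void main(String[] args) {
            byte a = 10;
            byte b = 20;

            // Arithmetic on byte/short operands is carried out as 32-bit int
            // arithmetic, so the result must be narrowed back explicitly.
            // byte c = a + b;       // does not compile: a + b is an int
            byte c = (byte) (a + b); // explicit narrowing cast required
            System.out.println(c);   // 30
        }
    }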

Upvotes: 2
