relysis

Reputation: 1075

Why are primitives not reference types?

I was reading about value and reference types, and one question I couldn't find a clear answer to was why primitives like int, double, etc. are not reference types, like strings for example. I know strings/arrays/other objects can be pretty big compared to ints (which I saw was the primary argument for references), so is the only reason not to make those primitives reference types that it would be overkill?

Upvotes: 1

Views: 47

Answers (1)

Michael Aaron Safyan

Reputation: 95489

This is only the case in some programming languages, and it is typically done as an optimization (to avoid memory dereferences or heap allocations for such simple types). However, there are languages that make basic numeric types and programmer-defined objects look and behave identically, sometimes letting the compiler or interpreter select automatically between a true heap-allocated object and a plain unboxed value, so that efficiency is maintained when the object-like capabilities are not used.

Python and Scala are examples where basic integers and regular objects are indistinguishable; Java, C++, and C are examples where builtin types are distinct from programmer-defined types.
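A minimal Java sketch of this distinction (the values chosen are illustrative). Primitives like `int` are compared and copied as bare values, while the wrapper type `Integer` is a reference type, so `==` compares object identity rather than the wrapped value; autoboxing is an example of a compiler bridging the two worlds automatically:

```java
public class PrimitivesVsReferences {
    public static void main(String[] args) {
        // Primitive int: a bare value, no heap allocation; == compares values.
        int a = 1000;
        int b = 1000;
        System.out.println(a == b);      // true: the values themselves are equal

        // Integer: a reference type wrapping the same value.
        Integer x = Integer.valueOf(1000);
        Integer y = Integer.valueOf(1000);
        System.out.println(x == y);      // typically false: two distinct objects
                                         // (values outside the small autobox
                                         // cache are not guaranteed to be shared)
        System.out.println(x.equals(y)); // true: compares the wrapped values

        // Autoboxing: the compiler converts between int and Integer
        // automatically, one way a language blurs the distinction.
        Integer boxed = a;               // int -> Integer
        int unboxed = boxed;             // Integer -> int
        System.out.println(unboxed == a); // true
    }
}
```

This is also why working directly with primitives avoids the dereference and allocation costs the answer mentions: an `int` lives inline (on the stack or in a register), while every `Integer` is an object behind a reference.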

Upvotes: 1
