choxsword

Reputation: 3359

How do we know the maximum size of a theoretically possible object of any type?

Here is the introduction for std::size_t on cppreference:

std::size_t can store the maximum size of a theoretically possible object of any type (including array).

I know the exact value is platform-dependent. But who decides the size of a theoretically possible object: the compiler, the operating system, or even the computer manufacturer? Can the size of a theoretically possible object be calculated, or is it just decided by a man-made rule?

In addition, if the machine is 64-bit, does that mean the max object size could be 2^64 bytes?

Upvotes: 2

Views: 716

Answers (3)

rustyx

Reputation: 85452

if the machine is 64-bit, does that mean the max object size could be 2^64 bytes?

The issue here is that a "64-bit machine" is insufficient information to answer the question. The main limiting factor here is how much sequential memory the instruction set can address at any given time. It is decided by the designer of the CPU architecture.

x86 has a number of execution modes: real mode (16 bit segmented), 32-bit mode and 64-bit mode. In each of these modes the size of the largest sequentially addressable area of memory is different. In real mode 1MB could be addressed, but only 64KB sequentially, hence size_t is 16 bits. In other modes the entire address range can be addressed sequentially.

It doesn't matter that a typical 64-bit processor today physically addresses only 48 bits, because it still operates on full 64-bit offsets and can theoretically address a 64-bit-long object. Hence the width of size_t there is still 64 bits.

In commodity processors the width of size_t usually corresponds to the size of the accumulator register, since that defines the maximum possible offset the CPU can address.

Upvotes: 2

Bo Persson

Reputation: 92311

It really goes both ways.

As size_t is the return type of the sizeof operator, the type also sets the max size of an object. For sizeof to work, no object can be allowed to exceed the size that size_t can represent.

Nothing requires a compiler to let you build a single object using all available memory. There might be an upper limit, which might be the result of the choice of types for size_t and ptrdiff_t.

On the other hand, with current 64-bit computers, a 64-bit size_t is far larger than required, as you cannot currently fit 16 exabytes of RAM in a computer smaller than 100 meters (cubed). So much for "theoretically possible". :-}

Upvotes: 3

Jerry Coffin

Reputation: 490338

But who decides the size of theoretically possible object [...] ?

The author of the "implementation" gets to decide. The implementation is a rather nebulous term that includes the compiler, run-time library, and often at least part of the OS.

In addition, if the machine is 64-bit, does that mean the max object size could be 2^64 bytes?

Not really. You probably can't exceed 2^64 - 1 bytes, but the limit may well be (and normally will be, at least on machines current as of 2018 when I'm writing this) much lower than that. Many current CPUs have much smaller real limits--around 2^42 - 1 is probably more realistic.

Upvotes: 6
