Reputation: 3625
A small part of an application I'm using to test some expected behavior is giving different output, depending on what processor I run it on. Here's the relevant part of the code:
for b := 0; b < intCounter; b++ {
    //int64Random = rand.Int63()
    int64Random = int64(rand.Int())
    //CHECKING FOR SANITY
    fmt.Println("int64Random is " + strconv.FormatInt(int64Random, 10))
    slcTestNums = append(slcTestNums, int64Random)
}
When I run this on my Mac (amd64, darwin) I get output like:
int64Random is 2991558990735723489
int64Random is 7893058381743103687
int64Random is 7672635040537837613
int64Random is 1557718564618710869
int64Random is 2107352926413218802
When I run this on a Pi (arm, linux) I get output like:
int64Random is 1251459732
int64Random is 1316852782
int64Random is 971786136
int64Random is 1359359453
int64Random is 729066469
If on the Pi I change the assignment to int64Random = rand.Int63() and recompile, I get output like:
int64Random is 7160249008355881289
int64Random is 7184347289772016444
int64Random is 9201664581141930074
int64Random is 917219239600463359
int64Random is 6015348270214295654
...which more closely matches what the Mac produces. Is this caused by something that changes at runtime due to the processor architecture? Why does int64(rand.Int()) generate int64-ranged numbers instead of keeping an int-ranged number and merely converting its type to match the variable it's stored in? Am I missing the Go documentation that describes this behavior?
Upvotes: 0
Views: 125
Reputation: 5676
According to https://golang.org/doc/go1.1:

The language allows the implementation to choose whether the int type and uint types are 32 or 64 bits. Previous Go implementations made int and uint 32 bits on all systems. Both the gc and gccgo implementations now make int and uint 64 bits on 64-bit platforms such as AMD64/x86-64.
rand.Int() returns an int. On amd64, int is 64 bits; on 32-bit ARM it is 32 bits, so the values stay in the 32-bit range even after the conversion to int64.
Upvotes: 1