Ashish Aggarwal

Reputation: 351

Mapping between BIGINT in SQL and BigInteger in Java

We are facing an issue where, when we try to map a Java BigInteger to a SQL bigint, its value gets changed. We also tried converting it to a long value before mapping, but that fails too because a long cannot hold the value. Using toString() works, but is there any other workaround apart from using toString()?

Upvotes: 4

Views: 13875

Answers (2)

Dai

Reputation: 155055

Java's BigInteger does not correspond to SQL Server's bigint - despite the similar names they are almost completely different types:

  • In Java, BigInteger is a type that represents an arbitrary-precision integer. Internally it is represented by a non-sparse byte vector that is free to resize as values increase.
  • In T-SQL, bigint represents a fixed 64-bit signed integer; the Java equivalent is long. It is an intrinsic type whose values are handled natively by the host computer's hardware. Note that both T-SQL's bigint and Java's long are signed types.

It's true that BigInteger can represent any bigint value, but the reverse is not true: you cannot store most BigInteger values in a bigint column.
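
To see that asymmetry concretely, here is a minimal sketch (the variable names are illustrative; longValueExact() has been part of BigInteger since Java 8):

import java.math.BigInteger;

BigInteger big = BigInteger.valueOf( 2 ).pow( 80 ); // 2^80: far beyond 64 bits

// longValue() silently keeps only the low 64 bits - the "value got changed"
// behaviour described in the question:
System.out.println( big.longValue() ); // prints 0, not 2^80

// longValueExact() fails fast instead of truncating:
big.longValueExact(); // throws ArithmeticException: BigInteger out of long range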

As you say you're using BigInteger in Java as your canonical value, you cannot use bigint in your table; you should instead use varbinary(n) (where n is some reasonable upper bound for the size of your BigInteger values), then use it like this:

yourTableRow.hugeValue = yourBigIntegerValue.toByteArray();

Where the hugeValue member of your entity class yourTableRow is of type byte[], which is compatible with varbinary(n) in T-SQL.
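
Reading the value back is the mirror image: BigInteger has a constructor that accepts the same big-endian two's-complement byte[] that toByteArray() produced. A minimal sketch, reusing the hypothetical yourTableRow entity:

BigInteger restored = new BigInteger( yourTableRow.hugeValue ); // exact round-trip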

I advise against using varbinary(MAX) owing to how blob values are stored[1]. It is highly unlikely you'll encounter integer values beyond 2^128 (16 bytes) or even 2^256 (32 bytes), so you shouldn't need more than varbinary(16) or varbinary(32).

If you know your numbers won't exceed 2^128 you'd be better off storing them as two separate bigint (64-bit) columns:

hugeValueUpper bigint NOT NULL,
hugeValueLower bigint NOT NULL

And some Java to extract the upper 8 bytes and lower 8 bytes:

byte[] raw = someBigIntegerValue.toByteArray(); // big-endian two's complement, minimal length
if( raw.length > 16 ) throw new IllegalArgumentException( "Value exceeds 128 bits." );

// Left-pad to exactly 16 bytes, sign-extending so negative values survive the round-trip:
byte[] bytes = new byte[16];
if( someBigIntegerValue.signum() < 0 ) java.util.Arrays.fill( bytes, (byte)0xFF );
System.arraycopy( raw, 0, bytes, 16 - raw.length, raw.length );

// toByteArray() is big-endian, so bytes[0..7] are the *upper* 8 bytes.
// Mask each byte with 0xFFL so widening byte-to-long does not sign-extend:
yourRow.hugeValueUpper = ( ( bytes[ 0] & 0xFFL ) << 56 ) |
                         ( ( bytes[ 1] & 0xFFL ) << 48 ) |
                         ( ( bytes[ 2] & 0xFFL ) << 40 ) |
                         ( ( bytes[ 3] & 0xFFL ) << 32 ) |
                         ( ( bytes[ 4] & 0xFFL ) << 24 ) |
                         ( ( bytes[ 5] & 0xFFL ) << 16 ) |
                         ( ( bytes[ 6] & 0xFFL ) <<  8 ) |
                         ( ( bytes[ 7] & 0xFFL )       );
yourRow.hugeValueLower = ( ( bytes[ 8] & 0xFFL ) << 56 ) |
                         ( ( bytes[ 9] & 0xFFL ) << 48 ) |
                         ( ( bytes[10] & 0xFFL ) << 40 ) |
                         ( ( bytes[11] & 0xFFL ) << 32 ) |
                         ( ( bytes[12] & 0xFFL ) << 24 ) |
                         ( ( bytes[13] & 0xFFL ) << 16 ) |
                         ( ( bytes[14] & 0xFFL ) <<  8 ) |
                         ( ( bytes[15] & 0xFFL )       );

And vice-versa to get values out.
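
For completeness, here is a sketch of that reverse direction, assuming the same hypothetical yourRow fields as above:

// Rebuild the 16-byte big-endian buffer from the two halves, then let
// BigInteger's two's-complement constructor interpret it:
byte[] bytes = new byte[16];
for( int i = 0; i < 8; i++ ) {
    bytes[i    ] = (byte)( yourRow.hugeValueUpper >>> ( 56 - 8 * i ) );
    bytes[i + 8] = (byte)( yourRow.hugeValueLower >>> ( 56 - 8 * i ) );
}
BigInteger restored = new BigInteger( bytes );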

[1] It is true that SQL Server will optionally store varbinary(MAX) values inline in a row if the value is small enough, but I would consider any BigInteger value that is too big to be stored in-row (and so would be moved to BLOB storage) to be symptomatic of an error situation, so setting a low bound on a varbinary(n) column means your program will fail faster.

Upvotes: 6

StuartLC

Reputation: 107237

Java's BigInteger is an arbitrary-length integer, limited only by available memory or implementation limits, whereas SQL Server's BIGINT type is simply a 64-bit signed integer. You are likely experiencing truncation once the value stored in your BigInteger exceeds 2^63-1.

If you are dealing with values exceeding a signed 64-bit integer, you will need to resort to alternatives, such as strings (e.g. VARCHAR(MAX)) or an array of bytes for storage.
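
For example, a decimal-string round-trip is lossless because BigInteger's String constructor parses exactly what toString() emits (a sketch; someBigIntegerValue is a stand-in for your own value):

String stored = someBigIntegerValue.toString(); // write this to the VARCHAR column
BigInteger restored = new BigInteger( stored ); // lossless reverse mapping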

Upvotes: 0
