user2250246

Reputation: 3967

What codec to use for converting bigint to Long?

I have a bigint field in my Cassandra table that I want to map to a long value. However, I am not sure how to specify this in my DataStax mapper entity class.

Here is the code:

@PartitionKey(1)
@Column(name="phone", codec=TypeCodec.)
private Double phoneNumber;

Can someone please tell me what to use in the Column annotation?

UPDATE
If I use

@PartitionKey(1)
@Column(name="phone")
private Long phoneNumber;

I get an error: java.lang.Double cannot be cast to java.lang.Long

And if I use

@PartitionKey(1)
@Column(name="phone")
private Double phoneNumber;

I get an error: Codec not found for requested operation: [bigint <-> java.lang.Double]

Long or Double does not matter that much to me.
I just want to be able to read it as a number.

I also tried:

@PartitionKey(1)
@Column(name="phone", codec=TypeCodec.PrimitiveDoubleCodec.class)
private Double phoneNumber;

But then it gives me an error: java.lang.NoSuchMethodException: com.datastax.driver.core.TypeCodec$PrimitiveDoubleCodec.<init>()

Upvotes: 1

Views: 2521

Answers (1)

Ashraful Islam

Reputation: 12830

By default, the Cassandra bigint type maps to the Java long type, so if you define phone as bigint in Cassandra, you have to define phone as long (or Long) in Java. If you want to map bigint to Double, you have to write a custom codec (a minimal sketch follows the table below).

Here is the CQL-to-Java type map:

---------------------------------
| CQL3 data type  | Java type   |
---------------------------------
| bigint          | long        |
| double          | double      |
---------------------------------

Source : http://docs.datastax.com/en/developer/java-driver/3.1/manual/#cql-to-java-type-mapping
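If you do want to keep the entity field as a Double, a custom codec converting between bigint and java.lang.Double could look roughly like the sketch below. The class name BigintToDoubleCodec is made up for illustration; only the TypeCodec API it extends comes from the driver.

import java.nio.ByteBuffer;

import com.datastax.driver.core.DataType;
import com.datastax.driver.core.ProtocolVersion;
import com.datastax.driver.core.TypeCodec;
import com.datastax.driver.core.exceptions.InvalidTypeException;

// Hypothetical codec mapping the CQL bigint type to java.lang.Double.
public class BigintToDoubleCodec extends TypeCodec<Double> {

    // The mapper instantiates codecs given in @Column(codec = ...) reflectively,
    // so a public no-arg constructor is required; its absence is what caused the
    // NoSuchMethodException with PrimitiveDoubleCodec in the question.
    public BigintToDoubleCodec() {
        super(DataType.bigint(), Double.class);
    }

    @Override
    public ByteBuffer serialize(Double value, ProtocolVersion protocolVersion) throws InvalidTypeException {
        if (value == null) {
            return null;
        }
        // bigint is encoded as an 8-byte big-endian long on the wire
        ByteBuffer bytes = ByteBuffer.allocate(8);
        bytes.putLong(0, value.longValue());
        return bytes;
    }

    @Override
    public Double deserialize(ByteBuffer bytes, ProtocolVersion protocolVersion) throws InvalidTypeException {
        if (bytes == null || bytes.remaining() == 0) {
            return null;
        }
        // absolute read so the buffer's position is left untouched
        return (double) bytes.getLong(bytes.position());
    }

    @Override
    public Double parse(String value) throws InvalidTypeException {
        if (value == null || value.isEmpty() || value.equalsIgnoreCase("NULL")) {
            return null;
        }
        return (double) Long.parseLong(value);
    }

    @Override
    public String format(Double value) throws InvalidTypeException {
        return value == null ? "NULL" : Long.toString(value.longValue());
    }
}

Because it has a public no-arg constructor, the codec can then be referenced directly from the mapping annotation:

@PartitionKey(1)
@Column(name = "phone", codec = BigintToDoubleCodec.class)
private Double phoneNumber;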

Upvotes: 2
