Reputation: 5629
I have a Postgres table with a creation_time field defined like this:
creation_time timestamp with time zone not null default now(),
And I have a protobuf message defined like this:
message Thing {
    // ... other fields here
    google.protobuf.Timestamp creation_time = 1;
}
After generating Go code with protoc, I end up with a struct like this:
type Thing struct {
    CreationTime *timestamppb.Timestamp `protobuf:"bytes,3,opt,name=creation_time,json=creationTime,proto3" json:"creation_time,omitempty"`
}
I use sqlx to read and write to a Postgres database. I want to read a database record and deserialize it into a Thing struct. The deserialization works for all the "trivial" fields (ints, strings), but the timestamp is giving me trouble. As you can see above, the CreationTime field ends up being of type *timestamppb.Timestamp, and sqlx doesn't know how to handle this type (it doesn't implement the Value and Scan methods).
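To show what I mean, here is a minimal sketch of the failure, assuming a things table and a lib/pq connection string (both made up for illustration). The database/sql machinery that sqlx builds on can only scan into basic Go types or into values implementing sql.Scanner, and *timestamppb.Timestamp is neither:

package main

import (
    "database/sql"
    "fmt"
    "time"

    _ "github.com/lib/pq"
    "google.golang.org/protobuf/types/known/timestamppb"
)

func main() {
    // Connection string and table name are assumptions for illustration.
    db, err := sql.Open("postgres", "dbname=example sslmode=disable")
    if err != nil {
        panic(err)
    }
    defer db.Close()

    // *timestamppb.Timestamp neither implements sql.Scanner nor is it one of
    // the basic types database/sql can convert into, so this Scan fails with
    // a conversion error.
    var ts *timestamppb.Timestamp
    err = db.QueryRow(`SELECT creation_time FROM things LIMIT 1`).Scan(&ts)
    fmt.Println(err)

    // Scanning into time.Time works, because the driver returns timestamptz
    // columns as time.Time.
    var t time.Time
    err = db.QueryRow(`SELECT creation_time FROM things LIMIT 1`).Scan(&t)
    fmt.Println(err, t)
}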
I would like to know how people handle this situation; I believe it must come up all the time, since it's so fundamental.
I am aware that I could use the protobuf struct only for communication with my API, use a plain Go struct for internal business logic, and write adapters to handle the conversion. I don't want to go that way; I would like to use the proto struct everywhere. I am also aware that I could write explicit Scan calls to handle the timestamps specifically, but I'd have to do that in multiple places in my code, and I don't like that idea.
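For reference, this is roughly what the adapter alternative would look like (just a sketch; the row struct and column names are made up, and Thing is the generated struct shown above, assumed to live in the same package for brevity):

package db

import (
    "time"

    "google.golang.org/protobuf/types/known/timestamppb"
)

// thingRow is a plain struct used only at the database layer; sqlx can scan a
// timestamptz column straight into time.Time.
type thingRow struct {
    ID           int64     `db:"id"`
    CreationTime time.Time `db:"creation_time"`
}

// toProto converts the row into the protoc-generated struct at the boundary.
func (r thingRow) toProto() *Thing {
    return &Thing{
        CreationTime: timestamppb.New(r.CreationTime),
    }
}

But as I said, I'd rather not maintain two parallel structs and conversions between them.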
I was thinking about implementing my own timestamp proto type (possibly just a wrapper around Google's timestamp type), just so I can implement the sql.Scanner and driver.Valuer interfaces.
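To make that concrete, here is roughly what I have in mind, assuming a small wrapper type (names made up) that embeds the protobuf timestamp and implements both interfaces:

package pbtime

import (
    "database/sql/driver"
    "fmt"
    "time"

    "google.golang.org/protobuf/types/known/timestamppb"
)

// Timestamp wraps timestamppb.Timestamp so that database/sql-based libraries
// such as sqlx can store and load it.
type Timestamp struct {
    *timestamppb.Timestamp
}

// Scan implements sql.Scanner: convert the time.Time returned by the driver
// into a protobuf timestamp.
func (t *Timestamp) Scan(src interface{}) error {
    if src == nil {
        t.Timestamp = nil
        return nil
    }
    v, ok := src.(time.Time)
    if !ok {
        return fmt.Errorf("cannot scan %T into Timestamp", src)
    }
    t.Timestamp = timestamppb.New(v)
    return nil
}

// Value implements driver.Valuer: hand the driver a time.Time, which it knows
// how to encode as timestamptz.
func (t Timestamp) Value() (driver.Value, error) {
    if t.Timestamp == nil {
        return nil, nil
    }
    return t.AsTime(), nil
}

The part I'm not sure about is how to get the protoc-generated struct to use such a wrapper instead of *timestamppb.Timestamp, which is why I'm asking how others deal with this.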
How do people normally handle this?
Upvotes: 1
Views: 812