Reputation: 392
I'm looking for a way to hold a type (maybe reflection.Type?) as a field in a custom struct. The reasoning behind this is that I'm decoding JSON arrays into structs from which I later build SQL queries, but ints, floats and timestamps are identical in the JSON array, though they are different when querying a database. This means I need to convert each value to its correct type before querying.
I think the answer lies somewhere in the reflect package, but I haven't worked out exactly how to utilize it.
What I'm hoping for is something like this:
type Column struct {
    name     string
    dataType type
}
someColumn := Column {name: "somecol", dataType: int}
convertedVal := SomeConversionFunc("5", someColumn.dataType)
Alternatively, this sort of thing could work as well:
type Column struct {
    name     string
    dataType func()
}
someColumn := Column {name: "somecol", dataType: ConvertToInt}
convertedVal := someColumn.dataType("5")
Any ideas?
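For what it's worth, the second alternative can be made to compile by giving the field a concrete function signature (a bare `func()` can't be called with an argument). A minimal sketch, where the `func(string) (interface{}, error)` signature and `ConvertToInt` are my own assumptions:

```go
package main

import (
	"fmt"
	"strconv"
)

// Column holds a conversion function with a concrete signature;
// the function takes the raw string and returns the typed value.
type Column struct {
	name     string
	dataType func(string) (interface{}, error)
}

// ConvertToInt is a hypothetical converter matching that signature.
func ConvertToInt(s string) (interface{}, error) {
	return strconv.Atoi(s)
}

func main() {
	someColumn := Column{name: "somecol", dataType: ConvertToInt}
	convertedVal, err := someColumn.dataType("5")
	if err != nil {
		panic(err)
	}
	fmt.Printf("%v (%T)\n", convertedVal, convertedVal) // 5 (int)
}
```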
Upvotes: 5
Views: 11334
Reputation: 477
The StructTag should help:
type Table struct {
    age   int     `sql:"type:int"`
    price float64 `sql:"type:float"`
}
Upvotes: 0
Reputation: 392
I attempted to use the solution offered by @evanmcdonnal, but I could not find a generic way of converting float64 (the type json.Unmarshal assigns to any number it unmarshals from the JSON array) to every data type found in the DB (timestamp proved to be a bit tricky, because reflect.Value does not export a conversion method to time.Time, which is the Go equivalent of Cassandra's timestamp).
What did work was using a typeConversion field instead of a dataType field, that is, holding a function that converts from the type json.Unmarshal assigns to the destined column's type.
My struct, therefore, looks like this:
type Column struct {
    name           string
    typeConversion func(reflect.Value) reflect.Value
}
Some typeConversion functions I already have look like this:
func floatToTime(varFloat reflect.Value) reflect.Value {
    timestamp := time.Unix(int64(varFloat.Float()), 0)
    return reflect.ValueOf(timestamp)
}

func floatToInt(varFloat reflect.Value) reflect.Value {
    return reflect.ValueOf(int(varFloat.Float()))
}
This turns out to be a very good solution because it is extremely generic: the struct defines the shape every conversion function must have, which means I can wrap any form of conversion to fit this API. And because the return value is always a reflect.Value, I can recover the underlying value with the correct type by calling Interface(), like so:
// value is the data unmarshaled by json
// I convert it to reflect.Value, then convert it again using my custom typeConversion
// Then I use Interface() to get the final value in the final (and column-appropriate) type
column.typeConversion(reflect.ValueOf(value)).Interface()
Upvotes: 2
Reputation: 48076
You're on the right track, I suppose. In your Column struct you're looking for reflect.Type, which you get with import "reflect".
If your Column struct contains the type name and the value (as a raw string), you should be able to write a method that switches on the type and produces a value of the correct type for each case. A brief reference for switches is here: https://golang.org/doc/effective_go.html#switch
type Column struct {
    Name        string
    DataType    reflect.Type
    TypedValue  interface{}
    StringValue string
}
Because you said "ints, floats and timestamps are identical in the JSON array", I'm assuming all the values in your JSON are technically strings (meaning they're all quoted). You may not need all of these fields, but the idea is to couple the type information with the property's name and its original value. Since a field of type interface{} can hold anything, you can keep the original JSON data (name and value) alongside the type information and the value in a sensible Go type, all in one instance of this struct.
Upvotes: 5