Reputation: 1467
I am handling a bunch of BigDecimals which I want to store in my database, ideally without any loss of accuracy. I don't know the nicest way to achieve that.
This came to my mind:
t.decimal :value, precision: 1000, scale: 30
It does not look like a good way to approach this problem: 1. it still compromises accuracy, and 2. it is getting unnecessarily large.
Is there a way to store the object, e.g. #<BigDecimal:586a238,'0.563E0',9(36)>, to the database (within a text column) and then re-initialize it as a BigDecimal?
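As a minimal sketch of the round trip I have in mind (the value here is just an example), BigDecimal already survives a pass through its string form:

require 'bigdecimal'

original = BigDecimal('0.563')
restored = BigDecimal(original.to_s)  # the string form parses back exactly
original == restored                  # => true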
Upvotes: 1
Views: 1801
Reputation: 52356
By default, a PostgreSQL decimal (numeric) column allows "up to 131072 digits before the decimal point; up to 16383 digits after the decimal point". Is that not enough?
https://www.postgresql.org/docs/9.1/static/datatype-numeric.html
Just use:
t.decimal :value
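For example, in a migration sketch like this (table and column names are assumptions, not from your question):

class CreateMeasurements < ActiveRecord::Migration[5.1]
  def change
    create_table :measurements do |t|
      # No precision/scale given, so PostgreSQL stores an unconstrained
      # NUMERIC, which keeps every digit within the limits quoted above
      t.decimal :value
    end
  end
end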
Upvotes: 1
Reputation: 106882
You might want to look into composed_of.
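A minimal sketch of the composed_of route, assuming a text column named value_string (the model and column names here are hypothetical):

class Measurement < ApplicationRecord
  # Maps the value_string column to a BigDecimal value object:
  # reads go through `constructor`, writes call `to_s` on the object
  composed_of :value,
              class_name: 'BigDecimal',
              mapping: %w(value_string to_s),
              constructor: ->(raw) { BigDecimal(raw) },
              converter: ->(input) { BigDecimal(input.to_s) }
end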
But I prefer using custom getter and setter methods, because I think they are easier to read and to understand.
Example: Imagine your attribute is named foo and you want to use BigDecimal in the app but store the value as a string in the database:
def foo
  # Re-create the BigDecimal from the stored string
  # (BigDecimal() instead of the deprecated BigDecimal.new)
  BigDecimal(read_attribute(:foo))
end

def foo=(foo)
  # Store the full-precision string representation
  write_attribute(:foo, foo.to_s)
end
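A hypothetical round trip with such a model (the model name is assumed, not from the question):

record = Measurement.new
record.foo = BigDecimal('0.1') / 3  # a value with many digits
record.save!
record.reload.foo                   # => the same BigDecimal, rebuilt from the string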
Upvotes: 1