Reputation: 5800
I have a large JSON object stored in a Postgres table column; the schema looks like this:
create_table "document" do |t|
  t.jsonb "data", default: []
end
At the moment I'm updating the JSON in the column like so:
# find document in Rails, then…
document.data['some_attribute'][2]['another_attribute'] = 100
document.save
However, I write this JSON attribute many times, and sometimes data is lost: if two calls write at the same time, each one saves the whole object, so one overwrites the other's change with its own stale copy.
For example, suppose two saves go through at the same time:
Save 1:
document.data['some_attribute'][2]['another_attribute'] = 100
document.save
Save 2:
document.data['some_attribute'][2]['different_attribute'] = 200
document.save
then one of the attributes will be lost, because the other save writes its whole JSON object with stale data that hasn't been refreshed yet.
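The lost update can be reproduced deterministically in plain Ruby, no database needed (a sketch with hypothetical names; the Marshal round-trip stands in for loading the row):

```ruby
# Shared "row": a jsonb column holding the document data.
row = { "some_attribute" => [nil, nil, {}] }

# Both writers load the document BEFORE either one saves.
read1 = Marshal.load(Marshal.dump(row))   # Save 1 loads the document
read2 = Marshal.load(Marshal.dump(row))   # Save 2 loads the document

read1["some_attribute"][2]["another_attribute"]  = 100
read2["some_attribute"][2]["different_attribute"] = 200

row.replace(read1)   # Save 1 writes the whole blob back
row.replace(read2)   # Save 2 overwrites it, losing Save 1's change

p row["some_attribute"][2]
# => {"different_attribute"=>200}   ("another_attribute" is gone)
```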
What is the best way to make both calls save their new data correctly?
Is there a method that can update just one key, like update_attribute
but for a jsonb attribute?
Upvotes: 5
Views: 3772
Reputation: 526
This happens because you update the document
concurrently; it has nothing to do with the attribute being JSON.
The right thing to do here is to lock the row before modifying it. That way the updates won't overwrite each other, but will be applied one at a time.
Try the following:
Document.lock.find_by(...)
ActiveRecord::Locking::Pessimistic
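A self-contained sketch of why serializing the read-modify-write fixes the race (plain Ruby, no database; a Mutex stands in for the row lock that `Document.lock.find` / `document.with_lock` takes via SELECT ... FOR UPDATE):

```ruby
row  = { "some_attribute" => [nil, nil, {}] }  # shared "jsonb column"
lock = Mutex.new

# Each update re-reads the row INSIDE the lock, mutates it, and writes
# it back — mimicking lock -> find -> modify -> save in Rails.
update = lambda do |key, value|
  lock.synchronize do
    copy = Marshal.load(Marshal.dump(row))  # load fresh data under the lock
    copy["some_attribute"][2][key] = value
    row.replace(copy)                       # save the whole blob back
  end
end

[
  Thread.new { update.call("another_attribute", 100) },
  Thread.new { update.call("different_attribute", 200) }
].each(&:join)

p row["some_attribute"][2]  # both writes survive
```

Because the second writer cannot read until the first has written, it always sees the other update and both keys end up in the saved object.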
Upvotes: 0
Reputation: 4709
PostgreSQL has the jsonb_set
function. You can update your model's column with it.
With this code, the record is updated in a single query:
Document.where(id: document.id).update_all(
  "data = jsonb_set(data, '{some_attribute,2,another_attribute}', to_json(#{your_int_value}::int)::jsonb)"
)
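If the value comes from user input, guard the interpolation. A minimal sketch that builds the same fragment; `Integer()` raises on non-numeric input, and `to_jsonb` (available since PostgreSQL 9.5, same as `jsonb_set` itself) is a shorter equivalent of the `to_json(...)::jsonb` cast:

```ruby
path  = %w[some_attribute 2 another_attribute]
value = "100"  # e.g. from params; Integer() rejects anything non-numeric

fragment = "data = jsonb_set(data, '{#{path.join(',')}}', to_jsonb(#{Integer(value)}))"
puts fragment
# => data = jsonb_set(data, '{some_attribute,2,another_attribute}', to_jsonb(100))
```

Pass the fragment to update_all as before. For non-integer values, prefer bind parameters (e.g. via ActiveRecord's sanitize_sql_array) instead of string interpolation.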
Upvotes: 5