Reputation: 13
I have a table representing an object:
id | field1 | field2 | field3
---|--------|--------|-------
1  | b      | f      | z
2  | q      | q      | q
I want to pass several objects to a Postgres function which will then update the corresponding rows.
At the moment I see only one way to do it: pass a jsonb containing an array of objects. For example:
[{"id":1, "field1":"foo", "field2":"bar", "field3":"baz"},{"id":2, "field1":"fooz", "field2":"barz", "field3":"bazz"}]
Is this the best way to perform such an update? And what is the best way to do it with jsonb input?
I don't really like converting the jsonb input to rows with select * from json_each('{"a":"foo", "b":"bar"}') and operating on those. I would prefer some way to execute a single UPDATE.
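For illustration, the call I have in mind would look roughly like this (update_objects is just a placeholder name for the function I am trying to write):

select update_objects('[{"id":1, "field1":"foo", "field2":"bar", "field3":"baz"},{"id":2, "field1":"fooz", "field2":"barz", "field3":"bazz"}]'::jsonb);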
Upvotes: 0
Views: 84
Reputation: 10163
You can do it the following way, using json_populate_recordset:
update test
set
  field1 = data.field1,
  field2 = data.field2,
  field3 = data.field3
from (select * from json_populate_recordset(
    NULL::test,
    '[{"id":1, "field1":"f.1.2", "field2":"f.2.2", "field3":"f.3.2"},{"id":2, "field1":"f.1.4", "field2":"f.2.5", "field3":"f.3.6"}]'
  )) data
where test.id = data.id;
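Since the question mentions passing the objects to a Postgres function, here is a minimal sketch of wrapping that UPDATE in a function that takes the array as a jsonb parameter (the function name update_test is an assumption; jsonb_populate_recordset is the jsonb counterpart of json_populate_recordset):

create or replace function update_test(objects jsonb) returns void
language sql as $$
  update test
  set
    field1 = data.field1,
    field2 = data.field2,
    field3 = data.field3
  from jsonb_populate_recordset(NULL::test, objects) as data
  where test.id = data.id;
$$;

-- usage: one call, one UPDATE over all rows described by the array
select update_test('[{"id":1, "field1":"foo", "field2":"bar", "field3":"baz"},{"id":2, "field1":"fooz", "field2":"barz", "field3":"bazz"}]'::jsonb);

Note that a key missing from one of the objects becomes NULL in the generated row, so the corresponding column would be overwritten with NULL.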
Upvotes: 0
Reputation: 1197
This can be achieved with a FROM clause in the UPDATE, together with an on-the-fly table built from the input values, assuming that rows of the DB table and the custom input are matched with each other on the basis of id:
update test as t1
set field1 = custom_input.field1::varchar,
    field2 = custom_input.field2::varchar,
    field3 = custom_input.field3::varchar
from (
  values
    (1, 'foo', 'bar', 'baz'),
    (2, 'fooz', 'barz', 'bazz')
) as custom_input(id, field1, field2, field3)
where t1.id = custom_input.id::int;
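If the input arrives as jsonb rather than as literal values, the same single-UPDATE pattern can take its rows from jsonb_to_recordset instead of VALUES. A sketch, assuming the table from the question and text columns:

update test as t1
set field1 = j.field1,
    field2 = j.field2,
    field3 = j.field3
from jsonb_to_recordset('[{"id":1, "field1":"foo", "field2":"bar", "field3":"baz"},{"id":2, "field1":"fooz", "field2":"barz", "field3":"bazz"}]'::jsonb)
       as j(id int, field1 text, field2 text, field3 text)
where t1.id = j.id;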
Upvotes: 1