Reputation: 9102
I have a table cart:
id | value | metadata
--------+-------+-------------------
45417 | 0 | {"value": "1300"}
45418 | 0 | {"value": "1300"}
276021 | 0 | {"value": "1300"}
and I'm trying to UPDATE the value column with the value from the JSONB metadata column, if it exists. I came up with the following query:
UPDATE cart
SET value = CAST(subquery.meta_val AS INTEGER)
FROM (
    SELECT id, metadata->>'value' AS meta_val
    FROM cart
    WHERE value = 0
      AND metadata->>'value' IS NOT NULL
) AS subquery
WHERE cart.id = subquery.id;
Now this works, but it takes quite a long time for the 4M rows I want to update in production, and the query looks redundant to me.
I think the next step would be to wrap all of this in a transaction and improve the query. Is there anything that can be done to improve this query's performance?
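One common way to keep a multi-million-row UPDATE from holding locks and bloating the table in a single huge transaction is to run it in batches, committing after each one. Below is a minimal runnable sketch of that pattern; it uses SQLite via Python purely as a stand-in (SQLite's `json_extract(metadata, '$.value')` plays the role of Postgres's `metadata->>'value'`), and the row count and batch size are made up for the demo:

```python
import sqlite3

# Build a toy version of the cart table with JSON metadata.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE cart (id INTEGER PRIMARY KEY, value INTEGER, metadata TEXT)")
conn.executemany(
    "INSERT INTO cart VALUES (?, ?, ?)",
    [(i, 0, '{"value": "1300"}') for i in range(1, 10001)],
)

BATCH = 2500  # arbitrary batch size for the demo
while True:
    # Update at most BATCH rows that still need fixing, then commit.
    cur = conn.execute(
        """
        UPDATE cart
        SET value = CAST(json_extract(metadata, '$.value') AS INTEGER)
        WHERE id IN (
            SELECT id FROM cart
            WHERE value = 0
              AND json_extract(metadata, '$.value') IS NOT NULL
            LIMIT ?
        )
        """,
        (BATCH,),
    )
    conn.commit()
    if cur.rowcount == 0:  # nothing left to update
        break

print(conn.execute("SELECT COUNT(*) FROM cart WHERE value = 0").fetchone()[0])  # 0
```

In Postgres you would keep the same shape (an `UPDATE ... WHERE id IN (SELECT ... LIMIT n)` loop driven from a script), committing between batches so each transaction stays short.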
Upvotes: 0
Views: 344
Reputation: 604
Try it without a subquery.
update cart as c
set value = coalesce((c.metadata->>'value')::int, 0)
where c.value = 0;
The WHERE clause keeps the update from rewriting rows that already have a non-zero value; the coalesce still handles rows whose metadata has no 'value' key.
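The key property of the single-statement form is that rows which already have a non-zero value are left untouched. A quick runnable check of that semantics, using SQLite via Python as a stand-in for Postgres (`json_extract(metadata, '$.value')` standing in for `metadata->>'value'`; the third row's value of 7 is invented for the test):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE cart (id INTEGER PRIMARY KEY, value INTEGER, metadata TEXT)")
conn.executemany(
    "INSERT INTO cart VALUES (?, ?, ?)",
    [
        (45417, 0, '{"value": "1300"}'),
        (45418, 0, '{"value": "1300"}'),
        (276021, 7, '{"value": "1300"}'),  # already set: must not be overwritten
    ],
)

# Single-statement update, no subquery, restricted to rows that still need it.
conn.execute(
    """
    UPDATE cart
    SET value = CAST(json_extract(metadata, '$.value') AS INTEGER)
    WHERE value = 0
      AND json_extract(metadata, '$.value') IS NOT NULL
    """
)

print(conn.execute("SELECT id, value FROM cart ORDER BY id").fetchall())
# [(45417, 1300), (45418, 1300), (276021, 7)]
```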
Upvotes: 1