Reputation: 9602
I'd like to create a Postgres trigger on a table to run when a row gets inserted or updated. The table has many columns, and I'd like the trigger to insert that row into another table. But in that other table, all those columns should be combined into one JSON object (JSONB in newer versions of Postgres).
original table

column1 | column2 | column3
--------|---------|--------
A       | B       | C

new table

combined_column
---------------------------------------
{ column1: A, column2: B, column3: C }
So the table that the trigger is created on would have, for example, 3 columns, but the table that the trigger inserts into would have only 1 column: a JSON object combining all the columns of the inserted/updated row in the original table.
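Sketched as DDL, something like this (names and types are just illustrative):

CREATE TABLE original_table (
    column1 text,
    column2 text,
    column3 text
);

CREATE TABLE new_table (
    combined_column jsonb  -- one JSON object per inserted/updated source row
);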
Upvotes: 2
Views: 1120
Reputation: 657122
It would be more efficient to save rows in their original form: no transformation is needed, they occupy less disk space, and it's faster and cleaner.
Just create a log table with identical structure:
CREATE TABLE tbl_log AS TABLE tbl LIMIT 0;
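The standard SQL spelling with the WITH NO DATA clause does the same:

CREATE TABLE tbl_log AS TABLE tbl WITH NO DATA;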
Or use the LIKE keyword to specify more closely what to take from the original, with INCLUDING clauses. Example:
CREATE TABLE tbl_log (LIKE tbl INCLUDING STORAGE);
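If you also want column defaults and CHECK constraints copied, more INCLUDING clauses can be stacked, for example:

CREATE TABLE tbl_log (LIKE tbl INCLUDING DEFAULTS INCLUDING CONSTRAINTS INCLUDING STORAGE);

For a plain log table you typically don't want indexes or constraints slowing down writes, so INCLUDING ALL is usually overkill.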
Trigger function:
CREATE OR REPLACE FUNCTION trg_tbl_log()
  RETURNS trigger
  LANGUAGE plpgsql AS
$func$
BEGIN
   -- copy the whole incoming row into the log table
   INSERT INTO tbl_log VALUES (NEW.*);
   RETURN NEW;  -- let the triggering INSERT / UPDATE proceed unchanged
END
$func$;
Trigger:
CREATE TRIGGER tbl_log
BEFORE INSERT OR UPDATE ON tbl
FOR EACH ROW EXECUTE PROCEDURE trg_tbl_log();
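To sanity-check the whole setup (assuming tbl has the three text columns from the question):

INSERT INTO tbl VALUES ('A', 'B', 'C');
TABLE tbl_log;  -- now contains a copy of the row: (A, B, C)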
In Postgres 11 or later, rather use the cleaner syntax:
...
FOR EACH ROW EXECUTE FUNCTION trg_tbl_log();
You can easily transform the row into a json value if you need to, with row_to_json(). Or simpler, just to_json(). It might be better to use to_jsonb() and save jsonb instead of json:
...
INSERT INTO generic_js_log (json_column) SELECT to_jsonb(NEW);
...
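Spelled out, the jsonb variant might look like this (function, trigger, and log table names are placeholders):

CREATE TABLE generic_js_log (json_column jsonb);

CREATE OR REPLACE FUNCTION trg_tbl_log_js()
  RETURNS trigger
  LANGUAGE plpgsql AS
$func$
BEGIN
   -- serialize the whole row into a single jsonb object,
   -- keyed by column name: {"column1": "A", "column2": "B", "column3": "C"}
   INSERT INTO generic_js_log (json_column)
   SELECT to_jsonb(NEW);
   RETURN NEW;
END
$func$;

CREATE TRIGGER tbl_log_js
BEFORE INSERT OR UPDATE ON tbl
FOR EACH ROW EXECUTE FUNCTION trg_tbl_log_js();  -- Postgres 11+ syntax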
Upvotes: 1