user1492128

Reputation: 1

Insert into target table and then update the source table field in Oracle

In Oracle, I have a requirement where I need to insert records from a source table into a target table and then update the PROCESSED_DATE field of the source once the target has been updated.

One way is to use a cursor and loop through the rows one by one.

Is there a more efficient way to do this?

Upvotes: 0

Views: 1365

Answers (4)

user1492128

Reputation: 1

Got another answer from someone else. That solution seems more reasonable than changing the isolation level, since all my new records will have PROCESSED_DATE as NULL (about 30 rows were inserted into the source during the time the records were being inserted into the target table). Also, the PROCESSED_DATE IS NULL rows can be updated only by my job; no other user can update these records at any point in time.

declare
  date_stamp date;
begin
  select sysdate
    into date_stamp
    from dual;

  update source
     set processed_date = date_stamp
   where processed_date is null;

  insert into target
  select *
    from source
   where processed_date = date_stamp;

  commit;
end;
/

Let me know any further thoughts on this. Thanks a lot for all your help on this.

Upvotes: 0

Jeffrey Kemp

Reputation: 60262

My preferred solution in this sort of instance is to use a PL/SQL array along with batch DML, e.g.:

DECLARE
  CURSOR c IS SELECT * FROM tSource;
  TYPE tarrt IS TABLE OF c%ROWTYPE INDEX BY BINARY_INTEGER;
  tarr tarrt;
BEGIN
  OPEN c;
  FETCH c BULK COLLECT INTO tarr;
  CLOSE c;
  FORALL i IN 1..tarr.COUNT
    INSERT INTO tTarget VALUES tarr(i);
  FORALL i IN 1..tarr.COUNT
    UPDATE tSource SET processed_date = SYSDATE
    WHERE tSource.id = tarr(i).id;
END;

The above code is an example only and makes some assumptions about the structure of your tables.

It first queries the source table and will only insert and update those records, which means you don't need to worry about other sessions concurrently inserting more rows into the source table while this is running.

It can also be easily changed to process the rows in batches (using the fetch LIMIT clause and a loop) rather than all-at-once like I have here.
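For illustration, a batched variant of the same approach might look like the following sketch. It keeps the assumed tSource/tTarget/id names from the answer above, and the batch size of 1000 is an arbitrary example:

DECLARE
  CURSOR c IS SELECT * FROM tSource;
  TYPE tarrt IS TABLE OF c%ROWTYPE INDEX BY BINARY_INTEGER;
  tarr tarrt;
BEGIN
  OPEN c;
  LOOP
    -- fetch at most 1000 rows per iteration instead of all at once
    FETCH c BULK COLLECT INTO tarr LIMIT 1000;
    EXIT WHEN tarr.COUNT = 0;
    FORALL i IN 1..tarr.COUNT
      INSERT INTO tTarget VALUES tarr(i);
    FORALL i IN 1..tarr.COUNT
      UPDATE tSource SET processed_date = SYSDATE
      WHERE tSource.id = tarr(i).id;
  END LOOP;
  CLOSE c;
END;

This keeps memory usage bounded for large source tables while still getting the benefit of bulk DML.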

Upvotes: 0

user330315

Reputation:

No need for a cursor, assuming you want to transfer the rows that have not yet been transferred (identified by a NULL value in processed_date):

insert into target_table (col1, col2, col3)
select col1, col2, col3
from source_table
where processed_date is null;

update source_table
   set processed_date = current_timestamp
where processed_date is null;

commit;

To avoid updating rows that were inserted during the runtime of the INSERT, or between the INSERT and the UPDATE, run the transaction in serializable mode.

Before you run the INSERT, start the transaction using the following statement:

set transaction isolation level SERIALIZABLE;  

For more details, see the Oracle manual on transaction isolation levels.
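Putting the pieces together, the whole job would then run as a single serializable transaction (a sketch using the same assumed table and column names as above):

set transaction isolation level SERIALIZABLE;

-- transfer the not-yet-processed rows
insert into target_table (col1, col2, col3)
select col1, col2, col3
from source_table
where processed_date is null;

-- mark exactly the same set of rows as processed;
-- serializable mode guarantees no new NULL rows became visible in between
update source_table
   set processed_date = current_timestamp
 where processed_date is null;

commit;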

Upvotes: 1

VikrantY

Reputation: 594

A trigger should work. The target table can have a trigger that, on insert, updates the source table's column with the processed date.
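A sketch of such a trigger, assuming hypothetical source_table/target_table names and that both tables share an id key (neither is given in the question):

CREATE OR REPLACE TRIGGER trg_target_mark_processed
AFTER INSERT ON target_table
FOR EACH ROW
BEGIN
  -- stamp the matching source row as processed
  UPDATE source_table
     SET processed_date = SYSDATE
   WHERE id = :NEW.id;
END;
/

Note that this fires a row-level UPDATE against the source for every inserted row, so for large transfers the bulk approaches in the other answers will generally perform better.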

Upvotes: 0
