Nick Div

Reputation: 5628

How to partially update millions of records from a large table

I have a table called Employee. To keep the question short I have not pasted all the columns here, but for example's sake let's say the output of desc employee is as follows:

EMPNO      NOT NULL NUMBER(4)    
FIRST_NAME          VARCHAR2(10) 
LAST_NAME           VARCHAR2(10) 
SALARY              NUMBER(24)   
HIREDATE            DATE         
DEPT_ID             VARCHAR2(20) 
ELIGIBILITY         VARCHAR(2)

The table contains more than 500 Million records and I need to run an update on it based on the following where clause:

update employee set ELIGIBILITY = 'N' where DEPT_ID IN 'INS, BAN, AUT';

Now the current update takes more than 24 hours.

I read online on a few forums that in these kinds of scenarios you should:

  1. insert into a temp table with no logging -> I have 2 questions regarding the first step:

    a) I need to know how to do the no logging part if I choose to insert into a new table

    b) Would I have to write two separate inserts: one for inserting records that match the criteria and one for inserting the rest of the records?

  2. Create indexes and constraints in parallel on the new table -> Based on what I have read, this is the command to create an index in parallel:

    CREATE INDEX {new_index} ON employee({column_name}) PARALLEL 35 NOLOGGING;

-- Please correct me here if I am wrong

  3. delete old table
  4. rename the new table.

Like I already mentioned, I need to update only certain rows that match the criteria, not all of them. How would I go about doing that with the above recommended solution?

Upvotes: 0

Views: 1559

Answers (1)

miracle173

Reputation: 1973

update

Your update statement will not work. The correct syntax is

update employee 
    set ELIGIBILITY = 'N' 
    where DEPT_ID IN ('INS', 'BAN', 'AUT');

insert

A temp table sounds strange here, because Oracle does have something called a temporary table, but you should not use one in this case: its data would be lost after you log out. So let us call it an auxiliary table.

CREATE TABLE auxtable (
...
)
NOLOGGING;

will create a table that is eligible for nologging operations, but only if neither the tablespace nor the whole database is in FORCE LOGGING mode.
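
You can check whether force logging is in effect in the data dictionary (a quick sketch; it assumes you are allowed to read V$DATABASE and DBA_TABLESPACES):

SELECT force_logging FROM v$database;
SELECT tablespace_name, logging, force_logging FROM dba_tablespaces;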

You now need only one INSERT statement to copy the data; it has to provide the appropriate ELIGIBILITY value for each row:

INSERT /*+APPEND */ INTO AUXTABLE(
    EMPNO,
    FIRST_NAME,
    LAST_NAME,
    SALARY,
    HIREDATE,  
    DEPT_ID,
    ELIGIBILITY)
SELECT
    EMPNO,
    FIRST_NAME,
    LAST_NAME,
    SALARY,
    HIREDATE,  
    DEPT_ID,
    DECODE(DEPT_ID,
        'INS', 'N',
        'BAN', 'N',
        'AUT', 'N',
        ELIGIBILITY
        )
FROM EMPLOYEE;

The APPEND hint advises Oracle to do the insert in NOLOGGING (direct-path) mode if this is enabled on the table.
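
The DECODE above maps the three department codes to 'N' and keeps the existing ELIGIBILITY for every other row. If you find it more readable, the same thing can be written as a CASE expression that mirrors the IN list from your original statement (interchangeable with the DECODE):

CASE WHEN DEPT_ID IN ('INS', 'BAN', 'AUT') THEN 'N' ELSE ELIGIBILITY END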

index

After index creation you should change the parallel clause of the index:

ALTER INDEX {new_index} NOPARALLEL;

or set it to the value that the corresponding index on the source table uses.
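
You can look up that value in the data dictionary, for example (a sketch that assumes the source table is called EMPLOYEE):

SELECT index_name, degree FROM user_indexes WHERE table_name = 'EMPLOYEE';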

You can use graphical tools like Oracle Grid Control or SQL Developer to extract the SQL (DDL) of a table, or you can use expdp/impdp to generate a SQL file.
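
Another option is the DBMS_METADATA package (just a sketch; in SQL*Plus run SET LONG 100000 first so the output is not truncated):

SELECT DBMS_METADATA.GET_DDL('TABLE', 'EMPLOYEE') FROM dual;

The expdp/impdp variant below achieves the same thing: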

$ expdp myuser content=metadata_only tables=myuser.MYTABLE DUMPFILE='mydump.dp'

Export: Release 11.2.0.4.0 - Production on Sun Jan 01 01:00:00 2017

Copyright (c) 1982, 2011, Oracle and/or its affiliates.  All rights reserved.
Password:

Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.4.0 - 64bit Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
Starting "MYUSER"."SYS_EXPORT_TABLE_01":  myuser/******** content=metadata_only tables=myuser.MYTABLE DUMPFILE=mydump.dp
Processing object type TABLE_EXPORT/TABLE/TABLE
Processing object type TABLE_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
Master table "MYUSER"."SYS_EXPORT_TABLE_01" successfully loaded/unloaded
******************************************************************************
Dump file set for MYUSER.SYS_EXPORT_TABLE_01 is:
  /mypath/mydump.dp



$ impdp myuser sqlfile=mytable.sql  tables=myuser.MYTABLE DUMPFILE='mydump.dp'

Import: Release 11.2.0.4.0 - Production on Sun Jan 01 01:00:10 2017

Copyright (c) 1982, 2011, Oracle and/or its affiliates.  All rights reserved.
Password:

Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.4.0 - 64bit Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
Master table "MYUSER"."SYS_SQL_FILE_TABLE_01" successfully loaded/unloaded
Starting "MYUSER"."SYS_SQL_FILE_TABLE_01":  myuser/******** sqlfile=mytable.sql tables=myuser.MYTABLE DUMPFILE=mydump.dp
Processing object type TABLE_EXPORT/TABLE/TABLE
Processing object type TABLE_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
Job "MYUSER"."SYS_SQL_FILE_TABLE_01" successfully completed at Sun Jan 01 01:00:15 2017 elapsed 0 00:00:05

The SQL file will be here:
/mypath/mytable.sql
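
swap

Once the auxiliary table is loaded, indexed and constrained, the last two steps from your list (drop the old table, rename the new one) are short (a sketch, assuming the names employee and auxtable from above; note that grants, synonyms and triggers on the old table do not move over automatically):

DROP TABLE employee;
ALTER TABLE auxtable RENAME TO employee;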

Upvotes: 1
