Reputation: 720
Sometimes it is necessary to take a backup of the schema metadata of all objects, along with only a limited number of rows from each table, using Oracle Data Pump (export and import).
In my example, I'll export 20 rows from each table together with all the metadata; you can modify the .par file accordingly. I will run two export commands.
Also, you can extract the DDL from a dump into a .SQL file.
In this way, we can have the data in a small, limited-size file for testing purposes, ready to import into another database.
Upvotes: 1
Views: 4268
Reputation: 720
CREATE DIRECTORY dirname AS '/path/path/path';
GRANT READ, WRITE ON DIRECTORY dirname TO public;
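Optionally, before running the export, you can verify that the directory object exists and points to the right path (assuming you have access to the DBA views; directory names are stored in uppercase):
SELECT directory_name, directory_path FROM dba_directories WHERE directory_name = 'DIRNAME';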
expdp username/password SCHEMAS=SCH_NAME DIRECTORY=DIR_NAME DUMPFILE=DMP_FILENAME.dmp CONTENT=METADATA_ONLY
vi exp_filename.par
DIRECTORY=DIR_NAME
DUMPFILE=DMP_FILENAME.dmp
LOGFILE=DMP_FILENAME.log
SCHEMAS=USER
INCLUDE=TABLE:"IN (select table_name from dba_tables where owner = 'USER')"
QUERY="WHERE rownum <= 20"
expdp username/password parfile=exp_filename.par
impdp username/password DIRECTORY=DIR_NAME DUMPFILE=DMP_FILENAME.dmp SQLFILE=exp_rajesh_all.sql
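If, instead of only extracting the DDL into a .sql file, you want to load the metadata into another database or a different schema, a minimal sketch of the import could look like the following (NEW_SCH_NAME is just a placeholder target schema, adjust to your environment):
impdp username/password DIRECTORY=DIR_NAME DUMPFILE=DMP_FILENAME.dmp REMAP_SCHEMA=SCH_NAME:NEW_SCH_NAME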
Here is another example of exporting with multiple queries, keeping only the latest data based on a date column; it can also be modified according to your requirements.
DIRECTORY = my_dir
DUMPFILE = exp_query.dmp
LOGFILE = exp_query.log
SCHEMAS = hr, scott
INCLUDE = TABLE:"IN ('EMP', 'DEPARTMENTS')"
#TABLES =(scott.emp,scott.dept,scott.salgrade)
QUERY = scott.emp:"WHERE job = 'ANALYST' OR sal >= 3000"
# Place following 3 lines on one single line:
QUERY = hr.departments:"WHERE department_id IN (SELECT DISTINCT
department_id FROM hr.employees e, hr.jobs j WHERE e.job_id=j.job_id
AND UPPER(j.job_title) = 'ANALYST' OR e.salary >= 3000)"
# With date filter
QUERY="where change_date > to_date('31-Dec-2020','DD-MON-YYYY')"
Note: the above parameter file is not tested, but you can use it as a reference. You can also add other Data Pump parameters to the .par file as required by your export/import.
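For completeness, a possible (untested) import of the row-limited dump into a test database might look like this; TEST_SCHEMA is only a placeholder, and TABLE_EXISTS_ACTION controls what happens when the tables already exist:
impdp username/password DIRECTORY=my_dir DUMPFILE=exp_query.dmp LOGFILE=imp_query.log REMAP_SCHEMA=scott:TEST_SCHEMA TABLE_EXISTS_ACTION=APPEND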
Upvotes: 1