Reputation: 2052
Oracle 11gR2 (x86 Windows):
I have a db with 250 tables with indexes and constraints. I need to re-create these tables, indexes and constraints in a new db and load the data. I need to know how to do the following in SQL Plus and/or SQL Developer, unless there's a magical utility that can automate all of this. Thanks in advance!
1. Unload (export) all the data from the 250 tables.
2. Create an SQL script file containing the CREATE TABLE statements for the 250 tables.
3. Create an SQL script file containing the CREATE INDEX statements for the 250 tables.
4. Create an SQL script file containing the ALTER TABLE ADD CONSTRAINT statements for the 250 tables.
5. Run the script to create the tables in a new db.
6. Load the exported data into the tables in the new db.
7. Run the script to create all the indexes.
8. Run the script to add all the constraints.
EDIT: I'm connected via remote desktop to a Windows Server 2008 machine that links to the source db; the remote machine only has an Oracle client installed. For security reasons, I'm not allowed to connect directly from my local computer to the Windows server, so can I dump the whole source db onto the remote machine and then zip it over to my local target machine? I'm trying to replicate the entire db on my computer.
Upvotes: 4
Views: 4707
Reputation: 551
SQL Developer can help with #1 by creating INSERT statements from a formatted query result (run the statement as a script for the hint to take effect):
Select /*insert*/ *
from My_Table;
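To capture those INSERT statements in a file, you could spool the script output. A minimal sketch, assuming the worksheet is run as a script; the spool path is just an example:

SPOOL c:\temp\my_table_data.sql
SELECT /*insert*/ * FROM My_Table;
SPOOL OFF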
Upvotes: -2
Reputation: 52000
Starting from Oracle 10g, you can use the Data Pump command-line clients expdp and impdp to export/import data and/or schema from one DB to another. As a matter of fact, those two command-line utilities are only wrappers that "use the procedures provided in the DBMS_DATAPUMP PL/SQL package to execute export and import commands, using the parameters entered at the command line" (quoted from Oracle's documentation).
Given your needs, you will have to create a directory, then generate a full dump of your database using expdp:
SQL> CREATE OR REPLACE DIRECTORY dump_dir AS '/path/to/dump/folder/';
sh$ expdp system@db10g full=Y directory=DUMP_DIR dumpfile=db.dmp logfile=db.log
As the dump is written in a binary format, you will have to use the corresponding import utility to (re)import your DB, basically replacing expdp by impdp in the above command:
sh$ impdp system@db10g full=Y directory=DUMP_DIR dumpfile=db.dmp logfile=db.log
For a simple table dump, use this version instead:
sh$ expdp sylvain@db10g tables=DEPT,EMP directory=DUMP_DIR dumpfile=db.dmp logfile=db.log
Note that you can use it with your standard user account, provided that account has been granted access to the given directory (GRANT READ, WRITE ON DIRECTORY dump_dir TO sylvain;).
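For completeness, the matching table-level import on the target database would look something like this (a sketch: the connect string newdb and the log file name are placeholders, and the dump file must first be placed in a directory registered on the target instance):

sh$ impdp sylvain@newdb tables=DEPT,EMP directory=DUMP_DIR dumpfile=db.dmp logfile=imp.log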
Upvotes: 6
Reputation: 17429
If you can create a database link from your local database to the one that currently contains the data, you can use the DBMS_DATAPUMP package to copy the entire schema. This is an interface to Data Pump (as @Sylvain Leroux mentioned) that is callable from within the database.
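For reference, the database link itself could be created with something like the following; all names here (db_link_name, remote_user, the password and the TNS alias) are placeholders to replace with your own:

CREATE DATABASE LINK db_link_name
  CONNECT TO remote_user IDENTIFIED BY remote_password
  USING 'remote_tns_alias';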
-- Run with SET SERVEROUTPUT ON so the DBMS_OUTPUT messages below are displayed
DECLARE
   dph             NUMBER;                                  -- Data Pump job handle
   source_schema   VARCHAR2 (30) := 'SCHEMA_TO_EXPORT';
   target_schema   VARCHAR2 (30) := 'SCHEMA_TO_IMPORT';
   job_name        VARCHAR2 (30) := UPPER ('IMPORT_' || target_schema);
   p_parallel      NUMBER := 3;
   v_start         TIMESTAMP := SYSTIMESTAMP;
   v_state         VARCHAR2 (30);
BEGIN
   -- Open a network-mode import job that reads directly over the database link
   dph :=
      DBMS_DATAPUMP.open ('IMPORT',
                          'SCHEMA',
                          'DB_LINK_NAME',
                          job_name);
   DBMS_OUTPUT.put_line ('dph = ' || dph);

   -- Only pull the objects owned by the source schema
   DBMS_DATAPUMP.metadata_filter (dph,
                                  'SCHEMA_LIST',
                                  '''' || source_schema || '''');
   -- Import them under a different schema name on the local database
   DBMS_DATAPUMP.metadata_remap (dph,
                                 'REMAP_SCHEMA',
                                 source_schema,
                                 target_schema);
   -- Replace any tables that already exist in the target schema
   DBMS_DATAPUMP.set_parameter (dph, 'TABLE_EXISTS_ACTION', 'REPLACE');
   DBMS_DATAPUMP.set_parallel (dph, p_parallel);

   DBMS_DATAPUMP.start_job (dph);
   DBMS_DATAPUMP.wait_for_job (dph, v_state);

   DBMS_OUTPUT.put_line ('Export/Import time: ' || (SYSTIMESTAMP - v_start));
   DBMS_OUTPUT.put_line ('Final state: ' || v_state);
END;
/
The script above actually copies and renames the schema. If you want to keep the same schema name, I believe you'd just remove the metadata_remap call.
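While the job is running, its progress can be watched from another session via the standard dictionary view (shown only as a convenience; requires the relevant privileges):

SELECT job_name, state, degree
  FROM dba_datapump_jobs;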
Upvotes: 2