Reputation: 559
I have an application which creates database schemas (Oracle 10g) for its users. Access to these schemas expires after a certain time. The schemas can be as large as 2 GB, while the application's actual operational data is comparatively small. To keep the database size down, what would be the best approach to archiving these schemas, given that they must be restorable whenever a user needs access again?
I am thinking of the following approach:
Convert each table in the schema to a .csv file and then compress the files (zip). Using CSV could be an advantage, since it is easy to convert CSV to/from database tables.
Please let me know if there is a better approach. The main aim here is to save operational database space.
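For reference, the kind of per-table CSV export I have in mind would look roughly like this (just a sketch using SQL*Plus spooling; the table and column names are placeholders):
SQL> -- one .csv per table; string concatenation builds each row
SQL> SET HEADING OFF
SQL> SET FEEDBACK OFF
SQL> SET PAGESIZE 0
SQL> SET TRIMSPOOL ON
SQL> SPOOL C:\archive\orders.csv
SQL> SELECT order_id || ',' || customer_id || ',' || order_date FROM orders;
SQL> SPOOL OFF
Each spooled file would then be zipped, and restoring would mean recreating the tables and loading the files back with SQL*Loader or external tables.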
Upvotes: 0
Views: 1047
Reputation: 36807
Use Data Pump Export and Data Pump Import instead of building a custom tool. Exporting and importing data and metadata is not a trivial task. Data Pump was built for exactly this situation, it is already included with Oracle, and it has many advanced features.
Here's a very simple example of archiving a schema using Data Pump.
Create a directory to hold the export. This is only required once per database.
SQL> create directory export_directory as 'C:\test';
Directory created.
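If the exporting account is not a DBA, it will also need read and write access on that directory, for example (some_app_user is a placeholder for whichever account runs expdp; it is not needed for the account used below):
SQL> grant read, write on directory export_directory to some_app_user;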
Create a test schema and sample data.
SQL> create user test_user identified by test_user;
User created.
SQL> alter user test_user quota unlimited on users;
User altered.
SQL> create table test_user.table1 as select 1 a from dual;
Table created.
Export Data Pump.
C:\test>expdp jheller@orcl12 directory=export_directory dumpfile=test_user.dmp schemas=test_user
Export: Release 12.1.0.1.0 - Production on Thu Jun 12 22:33:25 2014
Copyright (c) 1982, 2013, Oracle and/or its affiliates. All rights reserved.
Password:
Connected to: Oracle Database 12c Enterprise Edition Release 12.1.0.1.0 - 64bit Production
With the Partitioning, OLAP, Advanced Analytics and Real Application Testing options
Starting "JHELLER"."SYS_EXPORT_SCHEMA_01": jheller/********@orcl12 directory=export_directory dumpfile=test_user.dmp schemas=test_user
Estimate in progress using BLOCKS method...
Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
Total estimation using BLOCKS method: 64 KB
Processing object type SCHEMA_EXPORT/USER
Processing object type SCHEMA_EXPORT/DEFAULT_ROLE
Processing object type SCHEMA_EXPORT/TABLESPACE_QUOTA
Processing object type SCHEMA_EXPORT/PRE_SCHEMA/PROCACT_SCHEMA
Processing object type SCHEMA_EXPORT/TABLE/TABLE
Processing object type SCHEMA_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
Processing object type SCHEMA_EXPORT/STATISTICS/MARKER
. . exported "TEST_USER"."TABLE1" 5.031 KB 1 rows
Master table "JHELLER"."SYS_EXPORT_SCHEMA_01" successfully loaded/unloaded
******************************************************************************
Dump file set for JHELLER.SYS_EXPORT_SCHEMA_01 is:
C:\TEST\TEST_USER.DMP
Job "JHELLER"."SYS_EXPORT_SCHEMA_01" successfully completed at Thu Jun 12 22:34:35 2014 elapsed 0 00:00:56
Compress the file.
There is a Data Pump option to compress the data, but it requires the Advanced Compression option. Instead of paying thousands of dollars per core, I recommend downloading one of a thousand free programs that have been compressing data for decades.
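For reference, the licensed route would just add a parameter to the export command, roughly like this (a sketch only; COMPRESSION=ALL requires the Advanced Compression option and was not run here):
C:\test>expdp jheller@orcl12 directory=export_directory dumpfile=test_user.dmp schemas=test_user compression=all
Compressing with a free tool instead: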
C:\test>zip test_user.zip test_user.dmp
adding: test_user.dmp (172 bytes security) (deflated 90%)
C:\test>dir
Volume in drive C is OS
Volume Serial Number is 660C-91D8
Directory of C:\test
06/12/2014 10:37 PM <DIR> .
06/12/2014 10:37 PM <DIR> ..
06/12/2014 10:34 PM 1,435 export.log
06/12/2014 10:34 PM 212,992 TEST_USER.DMP
06/12/2014 10:37 PM 21,862 test_user.zip
3 File(s) 236,289 bytes
2 Dir(s) 689,950,937,088 bytes free
Drop the user.
SQL> drop user test_user cascade;
User dropped.
SQL> select count(*) from test_user.table1;
select count(*) from test_user.table1
*
ERROR at line 1:
ORA-00942: table or view does not exist
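If the dump file had been archived as a zip and the original .dmp deleted, it would first need to be extracted back into the export directory, for example (using the same zip tool as above; in this example the .dmp is still in place):
C:\test>unzip test_user.zip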
Import the user.
C:\test>impdp jheller@orcl12 directory=export_directory dumpfile=test_user.dmp
Import: Release 12.1.0.1.0 - Production on Thu Jun 12 22:41:52 2014
Copyright (c) 1982, 2013, Oracle and/or its affiliates. All rights reserved.
Password:
Connected to: Oracle Database 12c Enterprise Edition Release 12.1.0.1.0 - 64bit Production
With the Partitioning, OLAP, Advanced Analytics and Real Application Testing options
Master table "JHELLER"."SYS_IMPORT_FULL_01" successfully loaded/unloaded
Starting "JHELLER"."SYS_IMPORT_FULL_01": jheller/********@orcl12 directory=export_directory dumpfile=test_user.dmp
Processing object type SCHEMA_EXPORT/USER
Processing object type SCHEMA_EXPORT/DEFAULT_ROLE
Processing object type SCHEMA_EXPORT/TABLESPACE_QUOTA
Processing object type SCHEMA_EXPORT/PRE_SCHEMA/PROCACT_SCHEMA
Processing object type SCHEMA_EXPORT/TABLE/TABLE
Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
. . imported "TEST_USER"."TABLE1" 5.031 KB 1 rows
Processing object type SCHEMA_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
Processing object type SCHEMA_EXPORT/STATISTICS/MARKER
Job "JHELLER"."SYS_IMPORT_FULL_01" successfully completed at Thu Jun 12 22:42:18 2014 elapsed 0 00:00:19
Check the data.
SQL> select count(*) from test_user.table1;
COUNT(*)
----------
1
Upvotes: 2