PyQL

Reputation: 1830

import oracle dump file where source and destination databases have different character sets

I am trying to import a .dmp file that was exported using Oracle Database 11g Enterprise Edition into Oracle 11g XE running on Windows 7 but I receive many errors like the following:

ORA-02374: conversion error loading table "SCHEMA"."TABLENAME"
ORA-12899: value too large for column COLNAME (actual: 90, maximum: 75)

The import command I use:

impdp system/pwd remap_schema=OLD_SCHEMA:NEW_SCHEMA tables=OLD_SCHEMA.Table1,OLD_SCHEMA.Table2 directory=DATA_PUMP_DIR dumpfile=mydump.dmp logfile=import.log exclude=grant,index,statistics

The character set of the dump file is WE8ISO8859P1, while my destination database uses AL32UTF8.
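(For illustration, the byte expansion behind ORA-12899 can be seen directly in the destination database; LENGTHB returns the length in bytes, and the sample string is hypothetical:)

```sql
-- In an AL32UTF8 database, accented Latin-1 characters occupy 2 bytes each,
-- so a value that fit in a byte-sized VARCHAR2 column under WE8ISO8859P1
-- can exceed that same byte limit after conversion.
SELECT LENGTH('résumé')  AS char_count,   -- 6 characters
       LENGTHB('résumé') AS byte_count    -- 8 bytes: each 'é' takes 2 bytes
FROM dual;
```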

I read somewhere that Oracle 11g XE supports AL32UTF8 only, so I cannot change the destination character set to match the source.

Is there any way I can import the dump file without getting the conversion errors?

Thanks

Upvotes: 1

Views: 1104

Answers (1)

Rene

Reputation: 10551

The problem is that in AL32UTF8 some characters take up more than one byte, so values that fit in the source database overflow byte-sized columns in the destination. You will need to change your database's NLS_LENGTH_SEMANTICS to 'CHAR' and redo the import. A step-by-step explanation can be found here: http://albertolarripa.com/2012/06/10/ora-12899-changing-columns-to-char/
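A minimal sketch of that change, run in SQL*Plus as a privileged user (the parameter only affects tables created afterwards, so drop any partially imported tables and rerun the same impdp command):

```sql
-- Interpret VARCHAR2(n) / CHAR(n) as n CHARACTERS rather than n bytes
-- for objects created from now on. SCOPE=BOTH applies the setting to the
-- running instance and persists it in the spfile.
ALTER SYSTEM SET NLS_LENGTH_SEMANTICS = CHAR SCOPE = BOTH;
```

After this, a column defined as VARCHAR2(75) is created as 75 characters, so a converted value of 90 bytes but at most 75 characters no longer raises ORA-12899.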

Upvotes: 1
