Reputation: 3
I tried loading data into a table using SQL*Loader.
The log shows the actual length of the string as 101, whereas the maximum is 100, and it rejects the record. But when I checked, I found the length is 99.
The data type of the column is varchar2(100) in the table.
I didn't specify anything about length in the control file.
What would be the exact problem?
Upvotes: 0
Views: 3610
Reputation: 191235
Your data value only has 99 characters, but it seems some are multibyte characters; from a comment, at least one is the symbol ½.
There are two related ways to see this behaviour, depending on how your table is defined and what is in your control file.
You're probably seeing the effect of character length semantics. Your column is defined as 100 bytes; you're trying to insert 99 characters, but as some characters require multiple bytes for storage, the total number of bytes required for your string is 101 - too many for the column definition.
You can see that effect here:
create table t42 (str varchar2(10 byte));
Then if I have a data file with two rows, the second of which contains a multibyte character:
This is 10
This is 9½
and a simple control file:
LOAD DATA
CHARACTERSET UTF8
TRUNCATE INTO TABLE T42
FIELDS TERMINATED BY ','
TRAILING NULLCOLS
(
STR
)
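To reproduce this, a run along these lines should work (a sketch; the file names t42.ctl and t42.dat and the credentials are assumptions, not something from the question):
sqlldr userid=myschema/mypassword control=t42.ctl data=t42.dat log=t42.log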
Then trying to load that gets:
Record 2: Rejected - Error on table T42, column STR.
ORA-12899: value too large for column "MYSCHEMA"."T42"."STR" (actual: 11, maximum: 10)
Total logical records read: 2
Total logical records rejected: 1
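The 11 bytes reported for the second record can be checked directly; LENGTH counts characters and LENGTHB counts bytes, and in an AL32UTF8 database the ½ symbol needs two bytes (a quick check you can run from SQL*Plus or any client):
select length('This is 9½') as char_count, lengthb('This is 9½') as byte_count from dual;
which should report 10 characters and 11 bytes.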
If I recreate my table with character semantics:
drop table t42 purge;
create table t42 (str varchar2(10 char));
then loading with the same data and control file now gets no errors, and:
Total logical records read: 2
Total logical records rejected: 0
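Querying the table confirms the second value was loaded whole; it still occupies 11 bytes of storage, but that is now within the 10-character limit (a quick check against the T42 table above):
select str, length(str) as char_count, lengthb(str) as byte_count from t42;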
However, even when the table is defined with character semantics, you could still see this. If I remove the line CHARACTERSET UTF8
then my environment default (via NLS_LANG, which happens to set my client character set to WE8ISO8859P1) leads to a character set mismatch, and I again see:
Record 2: Rejected - Error on table T42, column STR.
ORA-12899: value too large for column "STACKOVERFLOW"."T42"."STR" (actual: 11, maximum: 10)
(Without that control file line, and with byte semantics for the column, the error reports the actual length as 13, not 11.)
So you need the table to be defined to hold the maximum number of characters you expect, and you need the control file to specify the character set if your NLS_LANG is defaulting it to something that doesn't match the database character set.
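Applied to the original problem, that would mean something like this (a sketch; mytable and str_col are placeholder names, since the question doesn't give the real ones):
alter table mytable modify (str_col varchar2(100 char));
and adding CHARACTERSET UTF8 near the top of the control file, as in the example above, assuming the data file is UTF-8 encoded.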
You can see the default semantics a new table will get by querying the database default and your current session default:
select value from nls_database_parameters where parameter = 'NLS_LENGTH_SEMANTICS';
select value from nls_session_parameters where parameter = 'NLS_LENGTH_SEMANTICS';
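If you want new tables in your session to default to character semantics, you can change the session parameter before creating them (this only affects objects created after the change; the database default is untouched):
alter session set nls_length_semantics = 'CHAR';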
For an existing table you can check which was used by looking at the user_tab_columns.char_used
column, which will be B for byte semantics and C for character semantics.
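For example, for the T42 table from above:
select column_name, data_length, char_length, char_used
from user_tab_columns
where table_name = 'T42';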
Upvotes: 4