zundarz

Reputation: 1594

What can I do to ensure fields longer than column width go to the BAD File?

When creating Oracle external tables, how should I phrase the reject rows clause to ensure that any field which exceeds its column width rejects and goes in the BADFILE?

This is my current design. I don't want records longer than 20 characters; I want them to go to the BADFILE instead. Yet they still appear when I select * from foobar.

DROP TABLE FOOBAR CASCADE CONSTRAINTS;

CREATE TABLE FOOBAR
(
 FOO_MAX20  VARCHAR2(20 CHAR)
 )
ORGANIZATION EXTERNAL
 (  TYPE ORACLE_LOADER
 DEFAULT DIRECTORY FOOBAR
 ACCESS PARAMETERS 
   ( RECORDS DELIMITED BY NEWLINE
     BADFILE     'foobar_bad_rec.txt'
     DISCARDFILE 'foobar_discard_rec.txt'
     LOGFILE     'foobar_logfile.txt'
     FIELDS
     MISSING FIELD VALUES ARE NULL
     REJECT ROWS WITH ALL NULL FIELDS 
            (
            FOO_MAX20 POSITION(1:20)
            )
      )
 LOCATION (foobar:'foobar.txt') )
REJECT LIMIT UNLIMITED
PARALLEL ( DEGREE DEFAULT INSTANCES DEFAULT )
NOMONITORING;

Here is my external file foobar.txt

1234567
1234567890123456
126464843750476074218751012345678901234567890
7135009765625
048669433593
7
527

Upvotes: 1

Views: 698

Answers (2)

Alex Poole

Reputation: 191275

You can't do this with the REJECT ROWS clause, as it only accepts one form - WITH ALL NULL FIELDS, which you already have.

You have a variable-length (delimited) record, but a fixed-length field. Everything after the last position you specify, which is 20 in this case, is treated as filler that you want to ignore. That isn't an error condition; you might have rubbish at the end of each record that simply isn't relevant to your table. Nothing says characters 21-45 in your third record shouldn't be there - just that you aren't interested in them.

It would be nice if you could discard them with the LOAD WHEN clause, but you don't seem to be able to compare, say, (21:21) to null or an empty string - the former isn't recognised and the latter causes an internal error, which isn't good.

You can make the longer records go to the bad file by forcing a SQL error when Oracle tries to put a longer parsed value from the file into the field, by changing:

FOO_MAX20 POSITION(1:20)

to

FOO_MAX20 POSITION(1:21)
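In the context of the question's DDL, that single change is all that's needed - a sketch with everything else left exactly as posted:

```sql
-- Same external table as in the question; only the POSITION range changes.
-- Reading one extra character (1:21) means any source value longer than 20
-- characters overflows the VARCHAR2(20 CHAR) column, raising ORA-12899 and
-- sending that record to the BADFILE.
CREATE TABLE FOOBAR
(
  FOO_MAX20  VARCHAR2(20 CHAR)
)
ORGANIZATION EXTERNAL
(  TYPE ORACLE_LOADER
   DEFAULT DIRECTORY FOOBAR
   ACCESS PARAMETERS
   ( RECORDS DELIMITED BY NEWLINE
     BADFILE     'foobar_bad_rec.txt'
     DISCARDFILE 'foobar_discard_rec.txt'
     LOGFILE     'foobar_logfile.txt'
     FIELDS
     MISSING FIELD VALUES ARE NULL
     REJECT ROWS WITH ALL NULL FIELDS
     (
       FOO_MAX20 POSITION(1:21)   -- was POSITION(1:20)
     )
   )
   LOCATION (foobar:'foobar.txt')
)
REJECT LIMIT UNLIMITED;
```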

Values that are up to 20 characters are still loaded:

select * from foobar;

FOO_MAX20          
--------------------
1234567              
1234567890123456     
7135009765625        
048669433593         
7                    
527                  

 6 rows selected 

but for anything longer than 20 characters it'll try to put 21 characters into the database's 20-character field, which gets this in the log:

error processing column FOO_MAX20 in row 3 for datafile /path/to/dir/foobar.txt
ORA-12899: value too large for column FOO_MAX20 (actual: 21, maximum: 20)

And the bad file gets that record:

126464843750476074218751012345678901234567890

Upvotes: 2

Lalit Kumar B

Reputation: 49082

Have a CHECK constraint on the column so it doesn't allow any value exceeding the `LENGTH`.
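For illustration only - a length CHECK constraint as this answer suggests, shown on an ordinary (non-external) table, since the table and constraint names here are made up and not from the question:

```sql
-- Hypothetical regular table with a CHECK constraint rejecting
-- any value longer than 20 characters at insert/update time.
CREATE TABLE foobar_internal
(
  foo_max20 VARCHAR2(20 CHAR)
    CONSTRAINT foo_max20_len_chk CHECK (LENGTH(foo_max20) <= 20)
);
```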

Upvotes: 0
