ennezetaqu

Reputation: 25

DataStage Job writes zeros instead of file values

I have a problem with a DataStage parallel job that fails to write the correct values for one column of a .txt file. Instead of the values contained in the file, it writes 0 in every record.

The structure of the Job is very simple: Sequential File -> Transformer -> Oracle Connector.

I tried building a simpler job that extracts data from a file and writes it into another file, and it worked. That makes me think the problem is in the Transformer stage. I thought it was due to the varchar-to-numeric conversion of the two numeric fields, but the problem affects only one of them.

The starting point is something like this:

col1;col2;col3;col4
123;"abc";123;123
123;"abc";123;123
123;"abc";123;123
123;"abc";123;123
123;"abc";123;123

and the output should be exactly the same when inserted into the table. Instead, the result is

col1;col2;col3;col4
123;"abc";123;0
123;"abc";123;0
123;"abc";123;0
123;"abc";123;0
123;"abc";123;0

Thanks in advance to anyone who will help me.

Upvotes: 0

Views: 65

Answers (1)

MichaelTiefenbacher

Reputation: 4005

Hard to say without seeing the job in full detail, but 0 is the default for a numeric data type, and it will be set if the column is defined as NOT NULL. An explicit conversion is recommended when you want to change data types.
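As a sketch of such an explicit conversion, the derivation for the failing column in the Transformer stage could trim the input and convert it explicitly (the link name lnk_file is a placeholder for your actual input link):

```
StringToDecimal(Trim(lnk_file.col4))
```

Trimming first guards against stray blanks before the numeric conversion is attempted.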

Another possible problem is blanks at the end of a row, which would break the data type conversion as well.
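To illustrate the failure mode, here is a small Python sketch (not DataStage code; `strict_to_int` is a hypothetical helper) of how a strict numeric parse rejects a value with trailing blanks or a Windows carriage return, so a NOT NULL column falls back to its default of 0:

```python
def strict_to_int(s: str, default: int = 0) -> int:
    # A strict converter: any non-digit character (including a trailing
    # blank or a "\r" left over from Windows line endings) makes the
    # parse fail, and the NOT NULL default kicks in instead.
    if s.isdigit():
        return int(s)
    return default

print(strict_to_int("123"))    # clean value parses to 123
print(strict_to_int("123 "))   # trailing blank -> default 0
print(strict_to_int("123\r"))  # carriage return -> default 0
```

Trimming the string before conversion (the equivalent of Trim in the transformer) makes all three inputs parse to 123.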

You could add a Peek stage between the Transformer and the database target to see how the data looks inside DataStage, and continue debugging from there.

Upvotes: 0
