I'm using Kettle for a transformation and hitting this same issue trying to get a value for an integer-based OUT parameter in MySQL (the field should actually be BIGINT, but I don't think that's supported in Kettle).
I've implemented the workaround for this exact issue using Decimal, but I'm having a hell of a time trying, firstly, to get the value back into the id field (not id_1), and secondly, to convert it back to an Integer so it can be inserted into the database.
This is the relevant flow:
This is the step definition:
On the inbound stream coming out of the lookup existing id step there is already a field called id, a Decimal that may hold either a value or null. If it's null, the filter rows step routes the row to this DB procedure lookup; in other words, id will always be null when it reaches the db procedure step (and hence I can't use an INOUT parameter here).
So what I want to happen is that when the row leaves the db procedure step, id is populated with the value of the OUT parameter (ideally as an Integer, but I can live with Decimal because of the bug). But instead a new field id_1 gets inserted into the stream.
Then, further down the line, assuming I have the value in the id field as a Decimal, I need to convert it back to an Integer so it can be inserted into the database without error. I don't really know how to write the JavaScript in Kettle, though, and I can't find documentation for the language.
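For what it's worth, here is a hedged sketch of what the conversion logic inside a Kettle "Modified Java Script Value" step could look like. In the real step the input field (id_1 here) is available directly as a variable; it is wrapped in a function below purely so the logic reads on its own, and the function name is my own invention:

```javascript
// Hypothetical conversion logic for a Kettle "Modified Java Script Value" step.
// Assumes the procedure's OUT parameter arrives as a Decimal field (id_1).
function toIntegerId(id_1) {
  // Guard against null: the out parameter may be absent for some rows.
  if (id_1 == null) return null;
  // The Decimal holds a whole number, so rounding changes the type, not the value.
  return Math.round(id_1);
}

var id = toIntegerId(42.0);
```

In the step's Fields grid you would then register id with type Integer and rename/replace it so it overwrites the existing id field rather than adding another one; the exact grid settings are an assumption on my part.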
So my questions are twofold:
1. How do I get the db procedure step to put the OUT parameter value back into id in the stream (rather than adding id_1)?
2. How do I write JavaScript that converts id (or id_1) into an Integer and places it back into the id field in the stream?
Kettle is built on Java and uses a JDBC driver. Let me summarize some sources and investigate the issue (I'll include PostgreSQL, since I use it most and this information is valuable to me).
First, let's check the sizes of the data types: java.lang.Long is 8 bytes, the MySQL BIGINT type is 8 bytes, and the PostgreSQL BIGINT type is also 8 bytes. By physical size they actually match, except for one problem: MySQL supports unsigned BIGINT, which is outside the range of java.lang.Long. I assume the problem appears at edge values, near the minimum and maximum of java.lang.Long.
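The mismatch at the edges can be made concrete. A small sketch (using JavaScript BigInt purely for illustration, since Kettle's scripting step is JavaScript; BigInt itself is not something I'd expect inside Kettle's engine):

```javascript
// Signed 64-bit max (java.lang.Long) vs unsigned 64-bit max (MySQL BIGINT UNSIGNED).
const longMax = 2n ** 63n - 1n;           // 9223372036854775807
const unsignedBigintMax = 2n ** 64n - 1n; // 18446744073709551615

// Any unsigned value above longMax cannot be represented as a Long,
// which is why a driver must fall back to java.math.BigInteger there.
console.log(unsignedBigintMax > longMax); // true
```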
Anyway, I tried to reproduce the issue from
http://forums.pentaho.com/showthread.php?48950-loosing-precision-of-BIGINT-workaround
on PostgreSQL 9.4 (JDBC driver postgresql-9.4-1201-jdbc41.jar, Kettle 5.4.0, JDK 7; Postgres is the only database available to me at the moment), and everything looks good. Most probably it's a problem with the MySQL JDBC driver, or a problem with Kettle not supporting java.math.BigInteger. java.math.BigInteger is the data type used to work with unsigned BIGINT values in MySQL, as described here:
http://dev.mysql.com/doc/connector-j/en/connector-j-reference-type-conversions.html
As for the actual problem, here is an example solution.
Sample CSV:
"id" "surname" "name" "birth_dt"
"1" "Gorovits" "Alex" "2001-01-01"
"2" "Osmani" "Adeb" "1998-03-06"
"" "Maiters" "John" "1981-07-07"
"" "Mateus" "Angela" "2004-04-04"
"5" "Sander" "Tom" "1990-05-05"
Notes: