Reputation: 974
I'm trying to unload data from an Amazon Redshift table and then copy it back from the S3 files into Redshift using the following commands.
unload ('select * from abx.xuz where src =\'vpc\'') to 's3://load/2017/' credentials 'aws_access_key_id=******;aws_secret_access_key=*****' allowoverwrite;
copy abc.abxy from 's3://sumo-processed/2017/'
credentials 'aws_access_key_id=XXXX;aws_secret_access_key=YY' NULL AS '\\0' TRUNCATECOLUMNS FILLRECORD maxerror as 255;
But it is not able to load all the records. Checking the stl_load_errors table shows the error "Invalid digit, Value '\', Pos 0, Type: Long".
How can I resolve this error so that all records are loaded without being skipped?
Upvotes: 3
Views: 19947
Reputation: 11
I was getting a similar error, "Invalid digit, Value '.', Pos 2, Type: Integer", and the reason was pandas.io.sql.read_sql converting all "INT NULL" columns into "FLOAT", so I had 10.0s instead of 10s in the dataframe. I then dumped the dataframe to S3 and imported it from S3 into Redshift using COPY, getting that error because COPY cannot load 10.0 into Redshift's INT4.
Fix: I replaced all "INT NULL" fields in the destination Redshift table with "DECIMAL(10,0)"; after that, COPY worked like a charm.
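For illustration, here is a minimal sketch of that dtype promotion. It uses a hand-built dataframe rather than read_sql, but the effect is the same: pandas represents NULLs as NaN, which forces an integer column to float64, so 10 becomes 10.0.

```python
import pandas as pd

# A column containing NULLs (NaN) cannot stay int64, so pandas
# promotes it to float64 -- the integer 10 becomes 10.0.
df = pd.DataFrame({"n": [10, None]})
print(df["n"].dtype)    # float64
print(df["n"].iloc[0])  # 10.0
```

If you'd rather keep the Redshift column as INT, converting the column to pandas' nullable integer dtype before exporting (e.g. `df["n"].astype("Int64")`, available in newer pandas versions) should also avoid the 10.0 values.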
Upvotes: 1