Rabiul Aleem

Reputation: 13

Not able to run a Copy activity with a BIT value in Azure Data Factory without column mapping when the sink is PostgreSQL

I have multiple CSV files in a folder, like employee.csv, student.csv, etc., all with headers, and I also have a table for each file (the header and table column names are the same).

employee.csv

id|name|is_active

1|raja|1

2|arun|0

student.csv

id|name

1|raja

2|arun

Table Structure:

employee:

id INT, name VARCHAR, is_active BIT

student:

id INT, name VARCHAR

Now I'm trying to run the Copy activity over all the files using a ForEach activity. The student table copied successfully, but the employee table did not; it throws an error while reading employee.csv.

Error Message:

{"Code":27001,"Message":"ErrorCode=TypeConversionInvalidHexLength,Exception occurred when converting value '0' for column name 'is_active' from type 'String' (precision:, scale:) to type 'ByteArray' (precision:0, scale:0). Additional info: ","EventType":0,"Category":5,"Data":{},"MsgId":null,"ExceptionType":"Microsoft.DataTransfer.Common.Shared.PluginRuntimeException","Source":null,"StackTrace":"","InnerEventInfos":[]}
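The TypeConversionInvalidHexLength error suggests the copy activity maps the PostgreSQL BIT column to a ByteArray and tries to parse the CSV value '0' as a hex byte string; a single hex digit has odd length, so the conversion fails. A minimal Python sketch of that failure mode (assuming the conversion behaves like `bytes.fromhex`, which is an illustration, not ADF's actual implementation):

```python
def parse_as_byte_array(value: str) -> bytes:
    """Mimic a hex-string-to-bytes conversion (assumption: ADF's
    ByteArray conversion behaves roughly like bytes.fromhex)."""
    return bytes.fromhex(value)

try:
    # '0' has odd length: one byte needs two hex digits, so this fails,
    # analogous to the copy activity's "invalid hex length" error.
    parse_as_byte_array("0")
except ValueError as exc:
    print("conversion failed:", exc)

# An even-length value parses fine:
print(parse_as_byte_array("00"))  # b'\x00'
```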

Upvotes: 0

Views: 1460

Answers (1)

Abhishek Khandave

Reputation: 3230

Use a Data Flow activity.

In the data flow, select the Source.

After this, add a Derived Column transformation and change the datatype of the is_active column from BIT to String.

For example, in my screenshot the Salary column had a string datatype, so I changed it to integer.

To modify the datatype, use the expression builder; you can use toString().
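For the is_active column from the question, the derived-column expression could look like the following (a sketch using ADF's data flow expression language; the column name is taken from the question):

```
toString(is_active)
```

With the column emitted as a string, the copy to the sink no longer attempts the BIT-to-ByteArray conversion.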


This way you can change the datatype before the sink.

In the last step, set the sink to PostgreSQL and run the pipeline.

Upvotes: 0
