Reputation: 11
I have a Data Flow in Azure Data Factory that I want to use to combine data from three sources (with some transformations in between) and then sink the result into a destination table. For the sink I created a table in SQL, matching the column headers and data types from my Data Flow in Azure.
However, when I publish the data flow, the sink table remains empty. The only error I get is under Mapping: "At least one incoming column is mapped to a column in the sink dataset schema with a conflicting type, which can cause NULL values or runtime errors." This seems to be preventing me from enabling Auto Mapping, so I mapped the columns manually.
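For reference, the sink table was created roughly along these lines (the table and column names here are illustrative, not my real schema):

    CREATE TABLE dbo.CombinedSink (
        CustomerId   INT            NULL,  -- integer column coming from the data flow
        CustomerName NVARCHAR(100)  NULL,  -- data flow "string" columns map to NVARCHAR
        OrderDate    DATETIME2      NULL,  -- data flow "timestamp" columns map to DATETIME2
        Amount       DECIMAL(18, 2) NULL   -- data flow "decimal" columns map to DECIMAL
    );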
This is where I am at the moment (see screenshots: DataFlowLayout and SinkTableDataMapping).
Has anyone experienced something similar?
Upvotes: 0
Views: 7867
Reputation: 11
Thanks for the feedback. In the end, the issue was that the pipeline that runs the data flow and sinks its data into the destination table was not set up properly. That is why the data flow showed no errors while the sink table stayed empty: the data flow was essentially hanging in the air, with nothing instructing it to actually perform the sink.
Upvotes: 0
Reputation: 3228
You can try the steps below:
Disable auto mapping of columns in the Sink transformation and map the columns manually.
Check that the Allow insert option is selected under the Sink transformation settings.
Also make sure the input and output column data types of the Sink transformation match, to avoid NULL values; see the sketch below.
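For example (hypothetical table and column names, not taken from the question), if the data flow outputs OrderDate as a timestamp but the sink column was created as VARCHAR, you get the "conflicting type" mapping warning; changing the sink column to a matching type resolves it:

    -- Sink column was VARCHAR while the incoming data flow column is a timestamp,
    -- which triggers the "conflicting type" mapping warning.
    ALTER TABLE dbo.SinkTable
    ALTER COLUMN OrderDate DATETIME2 NULL;

Alternatively, you can cast the incoming column to the expected type in a Derived Column transformation before the Sink.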
Upvotes: 0