user13128577

Azure Data Factory Copy Pipeline Did Not Copy Data Into The Sink

I created a test copy pipeline in Azure Data Factory which reads 2 sample rows from a text file in Azure Blob Storage and loads them into a table in an Azure SQL database. The run was successful.

However, no records were inserted into the table. The source file is only 34 bytes, and I read that the minimum block size for Azure Blob Storage is 64 KB. Could it be that my test file is too small for Azure to read, even though the pipeline ran successfully?

Upvotes: 1

Views: 1602

Answers (1)

Jay Gong

Reputation: 23782

Could it be that my test file is too small for Azure to read, even though the pipeline ran successfully?

I believe this is not related to the file size, because I tested with a single row and it worked fine for me.

Please try the following suggestions:

1. Check the configuration of the sink dataset to confirm it is exactly what you want.

2. Preview the data of the source dataset to confirm it is correct.

3. Check the monitor log of your pipeline, especially the data read and data written sizes.


4. Try configuring another sink dataset, for example Blob Storage, to check whether the data can be transferred successfully.
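For reference, a minimal sketch of a Copy activity definition with an explicit column mapping is shown below. The dataset and column names (`SourceBlobDataset`, `SinkSqlDataset`, `Col1`, `Col2`) are hypothetical placeholders; an empty or mismatched mapping is a common reason a run succeeds but writes nothing to the sink:

```json
{
    "name": "CopyBlobToSql",
    "type": "Copy",
    "inputs":  [ { "referenceName": "SourceBlobDataset", "type": "DatasetReference" } ],
    "outputs": [ { "referenceName": "SinkSqlDataset", "type": "DatasetReference" } ],
    "typeProperties": {
        "source": { "type": "DelimitedTextSource" },
        "sink": { "type": "AzureSqlSink" },
        "translator": {
            "type": "TabularTranslator",
            "mappings": [
                { "source": { "name": "Col1" }, "sink": { "name": "Col1" } },
                { "source": { "name": "Col2" }, "sink": { "name": "Col2" } }
            ]
        }
    }
}
```

If the sink table's column names differ from the source file's headers and no `translator` mapping is configured, the copy can complete without inserting the rows you expect, so it is worth verifying this section in your pipeline's JSON.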

If you have any other concerns, please feel free to let me know.

Upvotes: 0
