Reputation: 13
I was using the Copy activity to update rows in Azure Table Storage. Currently the pipeline fails if there is an error updating any of the rows/batches.
Is there a way to gracefully handle the failed rows and continue the Copy activity for the rest of the data?
I already tried the Fault Tolerance option that the Copy activity provides, but that does not solve this case.
Upvotes: 1
Views: 1505
Reputation: 7126
[Screenshots: source dataset, fault tolerance settings, and the resulting error message]
In the Copy activity, it is not possible to skip incompatible rows other than through fault tolerance. A workaround is to use a Data Flow activity to separate the compatible rows from the incompatible rows, and then copy the compatible data using the Copy activity. Below is the approach.
col4 needs to be checked before loading into Table Storage, so a condition is applied to the col4 data using a Conditional Split transformation added after the source transformation. The condition is given as:
FalseStream:
like(col4,'%#%')||like(col4,'%$%')||like(col4,'%/%')||like(col4,'%\\%')
**Sample characters are given in the above condition.**
The TrueStream will contain the rows that do not match the above condition.
[Screenshot: TrueStream data]
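For reference, a minimal mapping data flow script sketch of this flow could look like the following. The source schema (col1 to col4 as strings) and the names source1, ConditionalSplit1, CompatibleRowsSink, and IncompatibleRowsSink are assumptions for illustration; the split condition is the sample expression from above. FalseStream receives the rows matching the condition (incompatible), and TrueStream, the default stream, receives the rest.

```
source(output(
        col1 as string,
        col2 as string,
        col3 as string,
        col4 as string
    ),
    allowSchemaDrift: true,
    validateSchema: false) ~> source1
source1 split(
    like(col4,'%#%')||like(col4,'%$%')||like(col4,'%/%')||like(col4,'%\\%'),
    disjoint: false) ~> ConditionalSplit1@(FalseStream, TrueStream)
TrueStream sink(allowSchemaDrift: true,
    validateSchema: false) ~> CompatibleRowsSink
FalseStream sink(allowSchemaDrift: true,
    validateSchema: false) ~> IncompatibleRowsSink
```

Here the TrueStream sink would land the compatible rows in a staging location from which the Copy activity loads Table Storage, while the FalseStream sink can capture the incompatible rows (for example, in blob storage) for inspection.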
Upvotes: 2