TS74

Reputation: 61

Azure Data Factory: Get the filename and record number of the offending record

I am trying to ingest a large number of files, and the only error Azure Data Factory gives me is this:

Operation on target Copy isolation_advice_details to SQL failed: ErrorCode=PolybaseOperationFailed,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=Error happened when loading data into SQL Data Warehouse. Operation: 'Polybase operation'.,Source=Microsoft.DataTransfer.ClientLibrary,''Type=System.Data.SqlClient.SqlException,Message=HdfsBridge::recordReaderFillBuffer - Unexpected error encountered filling record reader buffer: HadoopExecutionException: Too long string in column [-1]: Actual len = [251]. MaxLEN=[250],Source=.Net SqlClient Data Provider,SqlErrorNumber=107090,Class=16,ErrorCode=-2146232060,State=1,Errors=[{Class=16,Number=107090,State=1,Message=HdfsBridge::recordReaderFillBuffer - Unexpected error encountered filling record reader buffer: HadoopExecutionException: Too long string in column [-1]: Actual len = [251]. MaxLEN=[250],},],'

It is frustrating because there are thousands of files in there. How do I find out which record in which file is causing this?
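For context, this is the kind of brute-force scan I could run if I pulled the files down locally, but I am hoping ADF can point me at the record directly. Sketch only; the folder, file pattern, and delimiter are placeholders, and it assumes the sources are comma-delimited text:

```python
#!/usr/bin/env python3
"""Report every field longer than the limit from the PolyBase error."""

import csv
from pathlib import Path

MAX_LEN = 250                     # column limit from the error message
SOURCE_DIR = Path("./staging")    # placeholder: local copy of the source files
DELIMITER = ","                   # placeholder: adjust to the actual format

def find_long_fields(root: Path, limit: int) -> None:
    for path in sorted(root.glob("**/*.csv")):
        with path.open(newline="", encoding="utf-8", errors="replace") as fh:
            reader = csv.reader(fh, delimiter=DELIMITER)
            for line_no, row in enumerate(reader, start=1):
                for col_no, field in enumerate(row, start=1):
                    if len(field) > limit:
                        # Print file, record, and column of each offender
                        print(f"{path}: line {line_no}, column {col_no}, "
                              f"len={len(field)}")

if __name__ == "__main__":
    find_long_fields(SOURCE_DIR, MAX_LEN)
```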

Upvotes: 0

Views: 633

Answers (2)

TS74

Reputation: 61

One way to do this is to enable "Fault Tolerance" in the "Copy to SQL" activity settings, which will skip (and optionally log) the offending rows instead of failing the whole copy. However, it wasn't appropriate for us: enabling fault tolerance on dozens of pipelines would mean a level of manual interaction in daily maintenance that our current resourcing wasn't geared to cope with. It might work for someone else :)
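For reference, this is roughly the fragment that enabling it adds to the copy activity's typeProperties, shown here as a Python dict mirroring the JSON; the linked service name and path are placeholders. The skipped rows get written to that path, which incidentally also tells you which records were offending:

```python
# Sketch of the copy activity fault-tolerance settings (dict mirrors the
# JSON fragment in the pipeline definition; names below are placeholders).
copy_type_properties = {
    "enableSkipIncompatibleRow": True,
    "redirectIncompatibleRowSettings": {
        "linkedServiceName": {
            "referenceName": "MyBlobStorageLinkedService",  # placeholder
            "type": "LinkedServiceReference",
        },
        # Container/folder that receives the skipped (incompatible) rows
        "path": "rejected-rows",
    },
}
```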

Upvotes: 0

KarthikBhyresh-MT

Reputation: 5044

This is most often caused by a column holding a Date or timestamp value. Once you identify the offending column, you can ALTER the table to the right datatype.

For detailed troubleshooting, follow the official MS doc: ADF throws error: Unexpected error encountered filling record reader buffer ClassCastException.
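If the culprit turns out to be a string column that is simply too narrow (as the length error in the question suggests), a minimal sketch of widening it, assuming a pyodbc connection; the server details and column name are placeholders (the table name comes from the error message), and it is subject to Synapse's ALTER COLUMN restrictions:

```python
# Sketch: widen a too-narrow string column once identified.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=<server>.database.windows.net;DATABASE=<dw>;UID=<user>;PWD=<password>"
)
# NVARCHAR(500) is an example target size; pick one that fits the actual data.
conn.execute(
    "ALTER TABLE dbo.isolation_advice_details "
    "ALTER COLUMN advice_text NVARCHAR(500);"
)
conn.commit()
conn.close()
```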

Upvotes: 0
