Reputation: 177
I am running into an issue when trying to parse data from a config file in Data Factory. I am using a configuration file, and the items are referenced in the copy activity. We have the option to parameterize the 'Column Delimiter' field of a dataset, so I am using the value from the file (because in some cases it is ';' and in others '\t'). When the delimiter is a semicolon it works perfectly, but when it is '\t' I get the following error:
Copy activity doesn't support multi-char or none column delimiter.
When I check the value that goes into the field, I see that it is not the one from the file (\t), but \\t.
Do you have any idea why this happens, or whether there is an escape character for it? I also tried the ASCII code (\0009) and got the same error - it doesn't know how to convert it. Thanks a lot!
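The root of the problem can be sketched in plain Python (this is an illustration of string escaping, not ADF code): the two-character sequence backslash + t is not the same as a single tab character, and a delimiter field that receives the two-character form is rejected as "multi-char".

```python
# What ADF appears to receive when '\t' is escaped to '\\t':
typed = "\\t"   # two characters: a backslash and the letter t
actual = "\t"   # one character: a real tab

print(len(typed))       # 2 -> treated as a multi-char delimiter
print(len(actual))      # 1 -> a valid single-char delimiter
print(typed == actual)  # False
```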
Upvotes: 0
Views: 3586
Reputation: 1
None of the answers worked for me either. In my case, I changed the value on the corresponding dataset:
https://www.youtube.com/watch?v=-yHCaqkeJy4
Upvotes: -1
Reputation: 154
I think I found a working solution for this issue; none of the answers so far worked for me.
Click on "Add dynamic content" under the delimiter parameter textbox to open the "Pipeline expression builder".
Since a tab character in base64 is CQ==, we can use the following function:
@base64ToString('CQ==')
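You can verify the encoding locally; a quick Python check (not ADF code) confirms that CQ== decodes to exactly one tab character, which is what @base64ToString('CQ==') produces in the pipeline expression:

```python
import base64

# Decode the base64 payload used in the ADF expression above.
decoded = base64.b64decode("CQ==").decode("utf-8")

print(decoded == "\t")  # True: it is a real tab
print(len(decoded))     # 1:    a single-char delimiter
```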
Upvotes: 0
Reputation: 1
You should use t instead of \t. Data Factory replaces t with \t itself; that is why \t ends up as \\t.
Upvotes: 0
Reputation: 11
The short answer: when entering a tab value in the UI, do not use \t; instead use "	". Between the empty quotes, I pasted an actual tab character.
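The same distinction matters when the delimiter comes from a configuration file. As a sketch, assuming the config file is JSON: the escape sequence \t inside a JSON string is parsed into a real one-character tab, whereas reading it from a plain text file would leave the raw two characters backslash + t.

```python
import json

# A JSON parser turns the escape sequence \t into a real tab character.
config = json.loads('{"columnDelimiter": "\\t"}')
delimiter = config["columnDelimiter"]

print(delimiter == "\t")  # True
print(len(delimiter))     # 1 -> a valid single-char delimiter
```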
Upvotes: 1
Reputation: 192
Can you try passing a real tab copied from a text editor, like ' '? This has been seen to work. Had the delimiter not been parameterized, you could have set it through the GUI or even in the code.
Upvotes: 2
Reputation: 23782
Based on the official documentation, multi-character delimiters are currently only supported in mapping data flows, not in the Copy activity.
You could try mapping data flows, which are also designed for data transformations in ADF. Please see more details here: https://learn.microsoft.com/en-us/azure/data-factory/concepts-data-flow-overview
If you have any concerns, please let me know.
Upvotes: 0