akkie

Reputation: 2543

How to transform JSON data directly in an Azure Data Factory pipeline

I have a lookup in an Azure Data Factory pipeline that is connected to a data flow. The lookup returns a JSON array of the form:

[
    {
        "COLUMN_NAME": "country_code"
    },
    {
        "COLUMN_NAME": "customer_role"
    },
    {
        "COLUMN_NAME": "valid_from"
    },
    {
        "COLUMN_NAME": "valid_to"
    }
]

The data flow accepts an array of strings as a parameter, so I need a way to transform the lookup output into an array of strings of the form:

[
    "country_code",
    "customer_role",
    "valid_from",
    "valid_to"
]

In my pipeline the output of the lookup is available through the pipeline expression @activity('GetKeyColumns').output.value.
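
For reference, with "First row only" disabled the full lookup output looks roughly like this (a sketch, the surrounding fields may differ slightly), so .value is exactly the array shown above:

{
    "count": 4,
    "value": [
        { "COLUMN_NAME": "country_code" },
        { "COLUMN_NAME": "customer_role" },
        { "COLUMN_NAME": "valid_from" },
        { "COLUMN_NAME": "valid_to" }
    ]
}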

For the data flow parameter (the array of strings) I can choose between a data flow expression and a pipeline expression. The data flow expression language has a map function that can reshape the data, but I have no idea how to use the output of the lookup in this function.
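
As far as I understand, the map function in the data flow expression language works roughly like this generic example (not my actual data):

map([1, 2, 3, 4], #item + 2)

which returns [3, 4, 5, 6].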

Is there a way to achieve this transformation in my pipeline?

Upvotes: 1

Views: 1718

Answers (1)

Steve Johnson

Reputation: 8660

You can add a variable of type Array, then use a ForEach activity with an Append variable activity inside it.

Steps:

1. Create a pipeline variable of type Array named string_array.

2. Create a ForEach activity with the items expression @activity('GetKeyColumns').output.value.

3. Inside the ForEach activity, create an Append variable activity that appends to string_array using the expression @item()['COLUMN_NAME'].

4. Pass string_array to the data flow by using the pipeline expression @variables('string_array'); a sketch of the pipeline JSON is shown below.
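
For reference, the ForEach / Append variable part of the pipeline JSON looks roughly like the sketch below. Activity names such as ForEachKeyColumn and AppendColumnName are placeholders, and the Lookup activity (GetKeyColumns) and the Data flow activity are omitted; only the parts that build string_array are shown:

{
    "name": "MyPipeline",
    "properties": {
        "variables": {
            "string_array": { "type": "Array" }
        },
        "activities": [
            {
                "name": "ForEachKeyColumn",
                "type": "ForEach",
                "dependsOn": [
                    { "activity": "GetKeyColumns", "dependencyConditions": [ "Succeeded" ] }
                ],
                "typeProperties": {
                    "items": {
                        "value": "@activity('GetKeyColumns').output.value",
                        "type": "Expression"
                    },
                    "isSequential": true,
                    "activities": [
                        {
                            "name": "AppendColumnName",
                            "type": "AppendVariable",
                            "typeProperties": {
                                "variableName": "string_array",
                                "value": {
                                    "value": "@item()['COLUMN_NAME']",
                                    "type": "Expression"
                                }
                            }
                        }
                    ]
                }
            }
        ]
    }
}

Setting isSequential to true keeps the appends in the same order as the lookup rows; with a parallel ForEach the order of the resulting array is not guaranteed.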

Upvotes: 1
