codeulike

Reputation: 23064

ADF - get table name from dataset

Azure Data Factory dynamic content seems very powerful and very under-documented.

Let's say I have a simple (non-dynamic) dataset pointing to a SQL table:

[screenshot: dataset connection pointing to a fixed SQL table]

And then in a Copy Data task I want to reference that table in the Pre-copy script, e.g. to clear out all the data.

[screenshot: Copy Data sink showing the Pre-copy script box]

How can I pull the table name out of the dataset? It's right there in the JSON, so surely there's a way?
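For reference, the dataset JSON looks roughly like this (a sketch from memory; the exact typeProperties depend on the connector, and the names here are made up):

{
    "name": "MySqlTableDataset",
    "properties": {
        "linkedServiceName": {
            "referenceName": "MySqlServer",
            "type": "LinkedServiceReference"
        },
        "type": "AzureSqlTable",
        "typeProperties": {
            "schema": "dbo",
            "table": "MyTable"
        }
    }
}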

For example, I see ADF has an activities collection that you can pull things from, e.g. @activity("Activity Name").output ... is there anything similar for pulling properties from datasets? Or from the current Copy Data activity, and then getting through to the dataset from there?

The official documentation gives a few quick examples but is very light on details. For example, when you do @activity("Activity Name").output you are hitting some sort of object model, but what other objects are available and what are their properties? Has anyone got anything on that?
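To illustrate what I'm hoping for, something along these lines (hypothetical syntax, as far as I can tell this does not actually exist):

@dataset('MySqlTableDataset').typeProperties.table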

Upvotes: 1

Views: 3591

Answers (2)

wBob

Reputation: 14389

I know what you mean; I didn't find ADF and its proprietary expression language that intuitive when I first started working with it either. But like most things, it just takes time to get to know it. I spent a lot of time with that article and did some practice. You can work with the outputs from activities: say you had a Lookup that fetched the schema and table name you wanted to work with, then you can reference that in later activities, e.g. @activity('Lookup1').output.firstRow.someTableName. The easiest way to see what an activity's output is, is to run it and look in the Output section; hover over the little 'exit' icon and it shows you the JSON, which you can navigate a bit like an object model, e.g. value.someVal1.someVal2.

[screenshot: activity Output pane showing the output JSON]
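For example, a Lookup with 'First row only' ticked might give you an output roughly like this (a sketch; the property names are whatever your query returns, plus some runtime info):

{
    "firstRow": {
        "someSchemaName": "dbo",
        "someTableName": "SalesOrderHeader"
    },
    "effectiveIntegrationRuntime": "AutoResolveIntegrationRuntime"
}

and then @activity('Lookup1').output.firstRow.someTableName pulls the table name out, so a pre-copy script could be built with something like @{concat('TRUNCATE TABLE ', activity('Lookup1').output.firstRow.someSchemaName, '.', activity('Lookup1').output.firstRow.someTableName)}.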

Upvotes: 1

NiharikaMoola

Reputation: 5074

You cannot access the dataset's property values from your pipeline. Since you are hardcoding the table name in your dataset, you can simply use the same hardcoded value in your pre-copy script.

Alternatively, you can create a dataset parameter, pass a value to it from the pipeline, and then use the same value in any activity inside the pipeline.

Example:

  1. Create a pipeline variable or parameter.

[screenshot: pipeline variable 'tbname']

  2. Create a dataset parameter in the dataset.

[screenshot: dataset parameter]

  3. Pass the variable value to the dataset parameter in the Copy data activity sink, and use the same variable in your pre-copy script (a JSON sketch of the dataset and the sink follows below).

Pre-copy script:

@{concat('Truncate table ',variables('tbname'))}

[screenshot: Copy data sink with the dataset parameter value and pre-copy script]
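In JSON terms, the parameterized dataset and the copy activity sink look roughly like this (a sketch trimmed to the relevant parts; MyDataset, TableName and the AzureSqlTable/AzureSqlSink types are only illustrative):

Dataset with parameter:

{
    "name": "MyDataset",
    "properties": {
        "type": "AzureSqlTable",
        "linkedServiceName": {
            "referenceName": "MySqlServer",
            "type": "LinkedServiceReference"
        },
        "parameters": {
            "TableName": { "type": "string" }
        },
        "typeProperties": {
            "table": {
                "value": "@dataset().TableName",
                "type": "Expression"
            }
        }
    }
}

Copy data activity (sink side), passing the variable into the dataset parameter:

{
    "name": "Copy data1",
    "type": "Copy",
    "typeProperties": {
        "sink": {
            "type": "AzureSqlSink",
            "preCopyScript": {
                "value": "@concat('Truncate table ', variables('tbname'))",
                "type": "Expression"
            }
        }
    },
    "outputs": [
        {
            "referenceName": "MyDataset",
            "type": "DatasetReference",
            "parameters": {
                "TableName": {
                    "value": "@variables('tbname')",
                    "type": "Expression"
                }
            }
        }
    ]
}

The preCopyScript expression here is the exported form of the @{concat(...)} shown above; at runtime it renders to something like Truncate table dbo.MyTable if the tbname variable holds dbo.MyTable.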

Upvotes: 1
