Akash

Reputation: 359

How to use dynamic connectors for source data in an Azure Data Factory pipeline

I want to create a data pipeline using Azure Data Factory where the source data can come from ADLS, Oracle, or S3, depending on how the user chooses to provide the data. That means I have to use three different connectors in the pipeline. Once the data is provided in any one of the sources, I want to transform it using Databricks and then store the output in the same source that was selected. For example, if I select S3 as the source, then my output should also be stored in the same S3 bucket; if I choose an Oracle table as the source, then the output should be stored in an Oracle table. Basically, I am trying to make a generic pipeline. Is it feasible to do this using ADF and Azure DevOps? Any pointers on how to get started would be really helpful.
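To make the goal concrete, the routing described above can be sketched as a simple dispatch table: the source type chosen by the user selects both the reader and the writer, so the transformed output always lands back in the same store. This is a minimal Python sketch only; the reader/writer functions and their dummy data are hypothetical placeholders, not real ADF, Databricks, or SDK calls.

```python
# Sketch of the "write back to the same source" routing from the question.
# The readers/writers below are hypothetical stand-ins; real code would
# call the ADLS, Oracle, and S3 SDKs (or be expressed as ADF activities).

def transform(rows):
    # stand-in for the Databricks transformation step
    return [r.upper() for r in rows]

# one (reader, writer) pair per supported source type
HANDLERS = {
    "adls":   (lambda: ["adls row"],   lambda out: ("adls", out)),
    "oracle": (lambda: ["oracle row"], lambda out: ("oracle", out)),
    "s3":     (lambda: ["s3 row"],     lambda out: ("s3", out)),
}

def run_pipeline(source_type):
    read, write = HANDLERS[source_type]   # same key picks reader and writer
    return write(transform(read()))       # output returns to the chosen source

print(run_pipeline("s3"))  # -> ('s3', ['S3 ROW'])
```

The key design point is that a single parameter (`source_type`) selects both ends of the pipeline, which is what the question means by "generic"; the difficulty in ADF, as the answer below notes, is that each connector needs its own linked service and dataset definition.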

Upvotes: 0

Views: 241

Answers (1)

Leon Yue

Reputation: 16431

No, we can't. A Data Factory connector uses a linked service to connect to its dataset, and different sources need different drivers and settings (file path/type, schema, and so on). It isn't possible for now.

Upvotes: 1
