Reputation: 43
This is my file pattern: adm_domain_20180401, adm_domain_20180402; these come from one particular source. The same folder also contains adm_agent_20180401, adm_agent_20180402. I want to copy only the files with the prefix adm_domain from Blob to ADL. Is there any way to define the file pattern in the input dataset?
DATASET:
{ "name": "CgAdmDomain", "properties": { "published": false, "type": "AzureBlob", "linkedServiceName": "flk_blob_dev_ls", "typeProperties": { "folderPath": "incoming/{Date}/", "format": { "type": "TextFormat" }, "partitionedBy": [ { "name": "Date", "value": { "type": "DateTime", "date": "SliceStart", "format": "yyyyMMdd" } } ] }, "availability": { "frequency": "Minute", "interval": 15 }, "external": true, "policy": {} } }
Upvotes: 0
Views: 364
Reputation: 2490
The fileFilter property is not available for Azure Blob Storage. If you are reading files from an on-premises file system, you can achieve this by specifying a filter that selects a subset of the files in the folderPath rather than all files - link
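For comparison, here is a minimal sketch of what that looks like on an on-premises file system dataset (type FileShare); the dataset name and linked service name are hypothetical, and the wildcard in fileFilter is an assumption about the pattern you want:
{
    "name": "OnPremAdmDomain",
    "properties": {
        "type": "FileShare",
        "linkedServiceName": "onprem_fileserver_ls",
        "typeProperties": {
            "folderPath": "incoming/{Date}/",
            "fileFilter": "adm_domain_*",
            "format": {
                "type": "TextFormat"
            },
            "partitionedBy": [
                {
                    "name": "Date",
                    "value": {
                        "type": "DateTime",
                        "date": "SliceStart",
                        "format": "yyyyMMdd"
                    }
                }
            ]
        },
        "availability": {
            "frequency": "Minute",
            "interval": 15
        },
        "external": true
    }
}
Again, this only applies when the source is an on-premises file server, not Azure Blob Storage.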
To achieve this solely for Azure Blob Storage, use Azure Data Factory custom activities. Implement the logic through custom code (.NET) and have it as an activity in the pipeline. More info about how to use custom activities - further reading.
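As a rough illustration, a custom (DotNetActivity) step in an ADF V1 pipeline could be declared like this; the assembly, class, Azure Batch linked service, and output dataset names are all hypothetical, and the prefix filtering itself would live in your .NET code:
{
    "name": "CopyAdmDomainBlobs",
    "type": "DotNetActivity",
    "linkedServiceName": "azure_batch_ls",
    "inputs": [ { "name": "CgAdmDomain" } ],
    "outputs": [ { "name": "AdlAdmDomain" } ],
    "typeProperties": {
        "assemblyName": "AdmDomainCopy.dll",
        "entryPoint": "AdmDomainCopy.CopyActivity",
        "packageLinkedService": "flk_blob_dev_ls",
        "packageFile": "customactivity/AdmDomainCopy.zip",
        "extendedProperties": {
            "prefix": "adm_domain"
        }
    },
    "policy": {
        "timeout": "01:00:00",
        "retry": 1
    }
}
Inside the custom code you would list the blobs under incoming/{Date}/ that start with adm_domain and copy only those to ADL.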
Upvotes: 0
Reputation: 266
Are you using ADF V1 or V2? We are working on adding filename wildcard support in ADF V2.
Upvotes: 0