Mark Allison

Reputation: 7228

How to read a local CSV file using Azure Data Factory and a self-hosted integration runtime?

I have a Windows Server VM with the ADF self-hosted integration runtime installed, running under a local account called deploy. This account is a member of the local Administrators group, and the server is not domain-joined.

I created a new linked service (File System) and pointed it at a CSV file in the root of the C: drive as a test. When I test the connection, I get Connection failed.

Error occurred when trying to access the file in Folder 'C:\etr.csv', File filter: ''. The directory name is invalid. Activity ID: 1b892702-7cc3-48d5-83c7-c680d6d15afd.

Any ideas on a fix?

[screenshot of the linked service configuration]

Upvotes: 2

Views: 2130

Answers (2)

Hyndavi

Reputation: 96

The dataset represents the structure of the data within the linked data store, while the linked service defines the connection to the data source. The linked service should therefore point to the folder rather than the file: use C:\ instead of C:\etr.csv.
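For reference, a minimal sketch of what the File System linked service JSON could look like once it points at the folder; the service name, integration runtime name, and credential placeholders below are illustrative, not values from the question:

```json
{
    "name": "LocalFileSystemLS",
    "properties": {
        "type": "FileServer",
        "typeProperties": {
            "host": "C:\\",
            "userId": "deploy",
            "password": {
                "type": "SecureString",
                "value": "<deploy account password>"
            }
        },
        "connectVia": {
            "referenceName": "<self-hosted integration runtime name>",
            "type": "IntegrationRuntimeReference"
        }
    }
}
```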

Upvotes: 1

igg

Reputation: 2250

The linked service needs to point to a folder on the target machine. In your screenshot, change C:\etr.csv to C:\, then define a new dataset that uses the linked service and selects etr.csv as the file.
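As a rough illustration (the dataset and linked service names are placeholders), the dataset would then reference that linked service and supply the file name in its location, along these lines:

```json
{
    "name": "EtrCsv",
    "properties": {
        "type": "DelimitedText",
        "linkedServiceName": {
            "referenceName": "LocalFileSystemLS",
            "type": "LinkedServiceReference"
        },
        "typeProperties": {
            "location": {
                "type": "FileServerLocation",
                "fileName": "etr.csv"
            },
            "columnDelimiter": ",",
            "firstRowAsHeader": true
        }
    }
}
```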

Upvotes: 2
