Reputation: 61
We have recently picked up Azure Data Factory as a replacement for SSIS packages in our data flow processing. One of the things I am exploring is how to build a pipeline that:
Kicks off a stored procedure in Snowflake - Done
Returns the table of data from Snowflake once the proc is complete - Done
Outputs that table of data to a CSV file in a network location - struggling
I'm not sure where to begin with this, or if it's even possible. I assume it is, given the nature of the task, but can anyone recommend any good guides or offer some help on how to complete that final step?
I'm currently exploring how to move the data into blob storage. Is that the right first step, or can the data go straight to a file in a local / network location?
Upvotes: 0
Views: 350
Reputation: 81
Pushing the CSV file to a local location is possible. Say, for example, the local location is your machine itself: you would first need to set up and install a self-hosted integration runtime on your machine so that it can do a handshake with ADF directly. Next, create a linked service based on the File System connector, and then a DelimitedText (CSV) dataset on top of that linked service.
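As a rough sketch, the JSON definitions in ADF's authoring view would look something like the following. All names here (the linked service, integration runtime, host, folder, and file names) are placeholders you would swap for your own, and the password should really come from Key Vault rather than be inlined:

```json
{
    "name": "NetworkShareLinkedService",
    "properties": {
        "type": "FileServer",
        "typeProperties": {
            "host": "\\\\fileserver\\exports",
            "userId": "DOMAIN\\svc-adf",
            "password": {
                "type": "SecureString",
                "value": "<your password>"
            }
        },
        "connectVia": {
            "referenceName": "SelfHostedIR",
            "type": "IntegrationRuntimeReference"
        }
    }
}
```

And the CSV dataset pointing at that linked service:

```json
{
    "name": "SnowflakeOutputCsv",
    "properties": {
        "type": "DelimitedText",
        "linkedServiceName": {
            "referenceName": "NetworkShareLinkedService",
            "type": "LinkedServiceReference"
        },
        "typeProperties": {
            "location": {
                "type": "FileServerLocation",
                "folderPath": "snowflake",
                "fileName": "output.csv"
            },
            "columnDelimiter": ",",
            "firstRowAsHeader": true
        }
    }
}
```

With those in place, a Copy activity running on the self-hosted integration runtime should be able to use your Snowflake query output as the source and this dataset as the sink.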
Upvotes: 0