proteus

Reputation: 555

import function from .csv file to azure db

I need to build some functionality that allows users to select a .csv file and upload its data to my database through an Azure web app. Since there's no file system available, how do I get the .csv data into a blob in my SQL Server DB? Are there any examples out there?

Upvotes: 0

Views: 1632

Answers (2)

ShirleyWang-MSFT

Reputation: 286

If you need to regularly upload .csv files from an on-premises file system to Azure, consider Azure Data Factory (ADF). ADF is a cloud-based, fully managed data integration service that can ingest data from about 30 different data sources in various formats and land it in any of the Azure data storage services (Blob, SQL DB, SQL DW, ADLS, Document DB, Azure Search).

Here is an example of how to configure an ADF pipeline to copy a file from on-premises to Azure Blob: https://learn.microsoft.com/en-us/azure/data-factory/data-factory-onprem-file-system-connector#json-examples-for-copying-data-to-and-from-file-system. We strongly recommend using the Copy Wizard, a GUI-based authoring experience, instead of hand-writing JSON. See this tutorial for a quick walkthrough on creating a pipeline with the Copy data wizard: https://learn.microsoft.com/en-us/azure/data-factory/data-factory-copy-data-wizard-tutorial
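For orientation only, here is a rough sketch of what the copy activity in such an ADF (v1) pipeline definition can look like. The dataset names (OnPremFileDataset, AzureBlobDataset) and the schedule window are placeholders you would define separately; the linked documentation above remains the authoritative reference:

{
  "name": "CopyCsvFromOnPremToBlob",
  "properties": {
    "description": "Illustrative sketch only; dataset names and schedule window are placeholders.",
    "activities": [
      {
        "name": "FileSystemToBlob",
        "type": "Copy",
        "inputs": [ { "name": "OnPremFileDataset" } ],
        "outputs": [ { "name": "AzureBlobDataset" } ],
        "typeProperties": {
          "source": { "type": "FileSystemSource" },
          "sink": { "type": "BlobSink" }
        }
      }
    ],
    "start": "2017-05-01T00:00:00Z",
    "end": "2017-05-02T00:00:00Z"
  }
}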

Upvotes: 0

Bruce Chen

Reputation: 18465

Since there's no file system available, how do I get the .csv data into a blob in my SQL Server DB?

If I understood you correctly, you could save the uploaded CSV file into a temp folder within your web content, or use the Azure Web App file system (e.g. d:\home or d:\local) to store the CSV file temporarily. Then you could leverage SqlBulkCopy together with this CSV Reader library for a simple way to import the CSV data into your SQL table. Here is a code snippet you could refer to:

// Assumed references: System.Data (SqlClient) and a CSV reader exposing
// CsvReader(TextReader, bool hasHeaders, char delimiter), e.g. the LumenWorks fast CSV reader.
using System;
using System.Data.SqlClient;
using System.IO;
using LumenWorks.Framework.IO.Csv;

string connectionString = "Data Source=tcp:{server-name}.database.windows.net,1433;Initial Catalog={db-name};User ID={user-name};Password={pwd}";
string filename = "{the-path-of-your-csv-file-in-webapp}";

using (var conn = new SqlConnection(connectionString))
{
    conn.Open();
    var transaction = conn.BeginTransaction();
    try
    {
        using (var file = new StreamReader(filename))
        // true = the first CSV row contains column headers; ',' is the field delimiter
        using (var csv = new CsvReader(file, true, ','))
        using (var copy = new SqlBulkCopy(conn, SqlBulkCopyOptions.KeepIdentity, transaction))
        {
            copy.DestinationTableName = "TableForBulkData";
            // Stream the CSV rows straight into the destination table
            copy.WriteToServer(csv);
            transaction.Commit();
        }
    }
    catch (Exception)
    {
        // Roll back the whole import if anything goes wrong, then rethrow
        transaction.Rollback();
        throw;
    }
}
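Note that the destination table (TableForBulkData here) must already exist in the Azure SQL database with columns matching the CSV layout; SqlBulkCopy copies rows into an existing table rather than creating one.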

Result

(screenshot of the import result in the SQL table)

UPDATE

As the Azure Web App sandbox documentation notes under File System Restrictions/Considerations:

Every Azure Web App has a home directory stored/backed by Azure Storage. This network share is where applications store their content. This directory is available for the sandbox with read/write access.

As a convenience for our customers, the sandbox implements a dynamic symbolic link in kernel mode which maps d:\home to the customer home directory. This is done to remove the need of the customer to keep referencing their own network share path when accessing the site. No matter where the site runs, or how many sites run on a VM, each can access their home directory using d:\home.
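Building on that, here is a minimal, illustrative sketch of the upload side, assuming an ASP.NET MVC controller. It saves the posted file under the app's home directory (App_Data) and then hands the path to a hypothetical CsvImporter.ImportCsvToSql helper that wraps the SqlBulkCopy snippet above:

using System.IO;
using System.Web;
using System.Web.Mvc;

public class ImportController : Controller
{
    [HttpPost]
    public ActionResult Upload(HttpPostedFileBase csvFile)
    {
        if (csvFile == null || csvFile.ContentLength == 0)
            return new HttpStatusCodeResult(400, "No file uploaded");

        // Save the upload under the home directory (d:\home\site\wwwroot\App_Data),
        // which is writable from the sandbox as described above.
        string folder = Server.MapPath("~/App_Data");
        Directory.CreateDirectory(folder);
        string path = Path.Combine(folder, Path.GetFileName(csvFile.FileName));
        csvFile.SaveAs(path);

        // Hypothetical helper wrapping the SqlBulkCopy/CsvReader import shown earlier
        CsvImporter.ImportCsvToSql(path);

        return Content("Import finished");
    }
}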

You could also use the Kudu console for a simple way to browse your web app content, as shown below:

(screenshot: browsing the web app content in the Kudu console)

Upvotes: 1
