Naz

Reputation: 21

How to automatically export data from SQL Server 2012 to CSV file?

Hope I don't upset anybody by asking too simple a question!

I have a requirement to export data from a SQL Server 2012 table to a CSV file. This needs to happen either every hour or, ideally if possible, whenever a record is created, updated, or deleted. The table contains a list of all the sites we maintain. I need to export this CSV file to a particular location, as a third-party database has an API which monitors this location and imports CSV files from there.

The data to be extracted from SQL is:

Mxmservsite.siteid as Marker_ID, mxmservsite.name as Name, 'SITE' as Group, '3' as Status, 
'' as Notes, mxmservsite.zipcode as Post_Code, 'GB' as Country, '' as Latitude, 
'' as Longitude, '' as Delete
Where dataareaid='ansa'

Anyone have any clues how I can go about doing this? Sorry, I am a newbie with SQL and still learning the basics! I have searched for similar questions in the past, but haven't found anything. I know there is a utility called BCP, but I'm not sure whether that would be the best way, and if it is, how do I run it every hour, or whenever a record is updated/deleted/inserted?

Cheers

Upvotes: 2

Views: 12916

Answers (3)

Swerdna

Reputation: 1

You'll need to specify the name of the server you are currently on. You can't write to a drive letter or an administrative share like D$; you need to use a UNC path to a shared folder. The following works in 2012.

-- Declare report variables
DECLARE @REPORT_DIR VARCHAR(4000)
DECLARE @REPORT_FILE VARCHAR(4000)
DECLARE @DATETIME_STAMP VARCHAR(14)
DECLARE @Statement VARCHAR(4000)
DECLARE @Command VARCHAR(4000)

-- Set variables for the report file
SET @DATETIME_STAMP = (SELECT CONVERT(VARCHAR(10), GETDATE(), 112) + REPLACE(CONVERT(VARCHAR(8), GETDATE(), 108), ':', '')) -- date/time stamp in YYYYMMDDHHMMSS format
SET @REPORT_DIR = '\\aServerName\SharedDirectory\' -- where to send the report: a server name plus a share name, not a drive letter
SET @REPORT_FILE = @REPORT_DIR + 'Tables_' + @DATETIME_STAMP + '.csv'

-- Create the CSV report. The @Statement variable must be used to build the command string with variables in it.
-- -c = character data, -t"," = comma field terminator (this is what makes the file a CSV), -r"\n" = row terminator,
-- -S = server\instance name, -T = trusted (Windows) authentication; -S must be supplied when -T is used.
SET @Statement = '"SELECT * FROM sys.tables" queryout "' + @REPORT_FILE + '" -c -t"," -r"\n" -S"ServerName\InstanceName" -T'
SET @Command = 'bcp ' + @Statement
EXEC master..xp_cmdshell @Command
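For the specific export in the question, the same pattern applies; here's a sketch, assuming the table lives in a database called MyDatabase under the dbo schema (the server, instance, and share names are placeholders). Two gotchas: Group and Delete are reserved words, so they need brackets, and the single quotes in the query must be doubled because it sits inside a T-SQL string:

-- A sketch adapting the pattern above to the question's query.
-- MyDatabase, ServerName\InstanceName, and the share path are placeholders.
DECLARE @Statement VARCHAR(4000)
DECLARE @Command VARCHAR(4000)

-- Single quotes are doubled inside the T-SQL string literal;
-- [Group] and [Delete] are bracketed because they are reserved words.
SET @Statement = '"SELECT siteid AS Marker_ID, name AS Name, ''SITE'' AS [Group], ''3'' AS Status, '''' AS Notes, zipcode AS Post_Code, ''GB'' AS Country, '''' AS Latitude, '''' AS Longitude, '''' AS [Delete] FROM MyDatabase.dbo.mxmservsite WHERE dataareaid = ''ansa''" queryout "\\aServerName\SharedDirectory\Sites.csv" -c -t"," -r"\n" -S"ServerName\InstanceName" -T'
SET @Command = 'bcp ' + @Statement
EXEC master..xp_cmdshell @Command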

Upvotes: 0

Greg

Reputation: 4045

Another option: create a SQL Agent job that runs a bcp.exe command to do the export for you, at whatever interval you want (e.g. every hour). With bcp.exe, you can specify the file location, the column/row terminators, and the filtering query.
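A rough sketch of setting that up with the msdb job procedures, assuming a job name of 'Export sites to CSV' and placeholder server/share names (the SELECT is trimmed to two columns for brevity; use the full query from the question):

USE msdb;
GO
-- Create a job with one CmdExec step that shells out to bcp.exe.
-- (A PowerShell step via @subsystem = N'PowerShell' works the same way.)
EXEC dbo.sp_add_job @job_name = N'Export sites to CSV';

EXEC dbo.sp_add_jobstep
    @job_name = N'Export sites to CSV',
    @step_name = N'Run bcp export',
    @subsystem = N'CmdExec',
    @command = N'bcp "SELECT siteid AS Marker_ID, name AS Name FROM MyDatabase.dbo.mxmservsite WHERE dataareaid = ''ansa''" queryout "\\aServerName\SharedDirectory\Sites.csv" -c -t, -S ServerName\InstanceName -T';

-- Schedule: daily, repeating every hour.
EXEC dbo.sp_add_jobschedule
    @job_name = N'Export sites to CSV',
    @name = N'Hourly',
    @freq_type = 4,              -- daily
    @freq_interval = 1,          -- every day
    @freq_subday_type = 8,       -- repeat on an hourly basis
    @freq_subday_interval = 1;   -- every 1 hour

-- Bind the job to the local server so the Agent will run it.
EXEC dbo.sp_add_jobserver @job_name = N'Export sites to CSV';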

If you want to export on every change, you can add an AFTER trigger, as mentioned in @John's answer, and simply exec the SQL Agent job from it; the job will execute asynchronously. If you are concerned about performance, test it out to understand the impact.
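A rough sketch of that trigger, assuming the hypothetical 'Export sites to CSV' job from the previous example exists; sp_start_job queues the job and returns immediately, so the DML statement isn't held up by the export itself:

-- Fires after any change to the sites table and queues the export job.
CREATE TRIGGER trg_mxmservsite_export
ON dbo.mxmservsite
AFTER INSERT, UPDATE, DELETE
AS
BEGIN
    SET NOCOUNT ON;
    -- sp_start_job is asynchronous: it returns as soon as the job is queued.
    EXEC msdb.dbo.sp_start_job @job_name = N'Export sites to CSV';
END;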

If you like @John's PowerShell script, stick it in a SQL Agent job and schedule it there, if only to keep all your SQL tasks centralized.

Upvotes: 0

JohnLBevan

Reputation: 24430

Here's some PowerShell that would do what you're after; just schedule it using the Windows Task Scheduler:

function Execute-SQLQuery {
    [CmdletBinding()]
    param (
        [Parameter(Mandatory = $true)]
        [string]$DbInstance
        ,
        [Parameter(Mandatory = $true)]
        [string]$DbCatalog
        ,
        [Parameter(Mandatory = $true)]
        [string]$Query
        ,
        [Parameter(Mandatory = $false)]
        [int]$CommandTimeoutSeconds = 30 #this is the SQL default
    )
    begin {
        write-verbose "Call to 'Execute-SQLQuery': BEGIN"
        $connectionString = ("Server={0};Database={1};Integrated Security=True;" -f $DbInstance,$DbCatalog)
        $connection = New-Object System.Data.SqlClient.SqlConnection
        $connection.ConnectionString = $connectionString
        $connection.Open()    
    }
    process {
        write-verbose "`n`n`n-----------------------------------------"
        write-verbose "Call to 'Execute-SQLQuery': PROCESS"
        write-verbose $query 
        write-verbose "-----------------------------------------`n`n`n"
        $command = $connection.CreateCommand()
        $command.CommandTimeout = $CommandTimeoutSeconds
        $command.CommandText = $query
        $result = $command.ExecuteReader()
        $table = New-Object "System.Data.DataTable"
        $table.Load($result)
        Write-Output $table
    }
    end {
        write-verbose "Call to 'Execute-SQLQuery': END"
        $connection.Close()
    }
}

Execute-SQLQuery -DbInstance 'myServer\InstanceName' -DbCatalog 'myDatabase' -Query @"
select Mxmservsite.siteid as Marker_ID
 , mxmservsite.name as Name
 , 'SITE' as [Group] --Group is a reserved word, so it's bracketed
 , '3' as Status
 , '' as Notes
 , mxmservsite.zipcode as Post_Code
 , 'GB' as Country
 , '' as Latitude
 , '' as Longitude
 , '' as [Delete] --Delete is a reserved word, so it's bracketed
 From mxmservsite --this wasn't in your original code
 Where dataareaid='ansa'
"@ | Export-CSV '.\MyOutputFile.csv' -NoType

To have something triggered on any change is possible; i.e. you could create a trigger on the table, then use xp_cmdshell to execute a script, or similar; but that's likely to lead to performance problems (triggers are often a bad option if used without being fully understood). Also, xp_cmdshell opens you up to some security risks.

There are many other ways to achieve this; currently I have a thing for PowerShell as it gives you loads of flexibility with little overhead.

Another option may be to look into using linked servers, to allow your source database to directly update the target without the need for the CSV.
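For illustration, a minimal linked-server sketch, assuming the third-party system exposes a SQL-accessible database (all the names here are placeholders):

-- Register the remote server (placeholder names throughout).
EXEC master.dbo.sp_addlinkedserver
    @server = N'TargetServer',
    @srvproduct = N'',
    @provider = N'SQLNCLI',
    @datasrc = N'RemoteHost\InstanceName';

-- Push rows straight into the remote table, bypassing the CSV hand-off.
INSERT INTO [TargetServer].[TargetDb].[dbo].[Sites] (Marker_ID, Name)
SELECT siteid, name
FROM dbo.mxmservsite
WHERE dataareaid = 'ansa';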

Upvotes: 2
