Reputation: 1675
I have a database in SQL Server with a lot of tables and I wish to export all of them in CSV format. From a very similar question asked previously, Export from SQL Server 2012 to .CSV through Management Studio:
Right-click on your database in Management Studio and choose Tasks -> Export Data...
Follow the wizard, and in the destination step choose 'Flat File Destination'. Type your file name and choose your options.
What I want is the capability to export all tables at once. The SQL Server Import and Export Wizard only permits one table at a time, which is pretty cumbersome if you have a very big database. I think a simpler solution might involve writing a query, but I'm not sure.
Upvotes: 44
Views: 64425
Reputation: 2176
I recently had to export a lot of tables from SQL Server to CSV, and some columns contained binary data. I came up with a Python script, with pyodbc as its only dependency, that does the job. After that I was able to import those tables into a Postgres database.
Posting this because the presented answers don't handle binary data.
Export script:
import os
import pyodbc
import argparse


def parse_args():
    parser = argparse.ArgumentParser(description="bulk table export")
    parser.add_argument("--server", "-s", metavar="server", required=True, type=str, help="sql server host")
    parser.add_argument("--db", "-d", metavar="db", required=True, type=str, help="db name")
    parser.add_argument("--output", "-o", metavar="csv", required=True, type=str, help="where to export")
    return parser.parse_args()


def connect(server, db):
    return pyodbc.connect(f"Driver={{SQL Server}};Server={server};Database={db};Trusted_Connection=yes;")


if __name__ == '__main__':
    args = parse_args()
    with connect(args.server, args.db) as conn:
        with conn.cursor() as cursor:
            # get all tables for export
            query = """
            SELECT s.name as schemaName, t.name as tableName
            FROM sys.tables t
            INNER JOIN sys.schemas s ON t.schema_id = s.schema_id
            """
            tables = [(s, t) for (s, t) in cursor.execute(query) if s == "dbo"]

            export_folder = os.path.join(args.output, args.db)
            if not os.path.exists(export_folder):
                os.makedirs(export_folder)

            # start exporting
            for (schema, table) in tables:
                path = os.path.join(export_folder, f"{table.lower()}.csv")
                with open(path, "w", encoding="utf-8") as f:
                    print(f"writing table '{table}'")
                    # write header
                    rows = cursor.execute(f"SELECT * FROM [{schema}].[{table}]")
                    f.write(",".join([column[0] for column in cursor.description]) + "\n")
                    # fill all rows
                    for row in rows:
                        r = []
                        for cell in row:
                            if type(cell) == bytes:
                                # binary data is converted to octal escapes;
                                # this format is what the Postgres CSV import expects,
                                # adapt it here as you wish
                                byte_string = []
                                for b in cell:
                                    b_str = oct(b).replace("0o", "")
                                    byte_string.append("\\" + f"{b_str:0>3}")
                                byte_str = "".join(byte_string)
                                r.append(f"${byte_str}$")
                            elif cell is None:
                                r.append("")
                            else:
                                r.append(f"${cell}$")
                        f.write(",".join(r) + "\n")
Example usage:
python csv_export.py ^
--output "C:\amazing\path\csv" ^
--server "AMAZING_SERVER_NAME" ^
--db "MY_VERY_IMPORTANT_DB"
Upvotes: 0
Reputation: 1960
The export wizard only allows one table at a time. I used the PowerShell script below to export all my tables to CSV; give it a try.
$server = "SERVERNAME\INSTANCE"
$database = "DATABASE_NAME"
$tablequery = "SELECT s.name as schemaName, t.name as tableName from sys.tables t inner join sys.schemas s ON t.schema_id = s.schema_id"
#Declare Connection Variables
$connectionTemplate = "Data Source={0};Integrated Security=SSPI;Initial Catalog={1};"
$connectionString = [string]::Format($connectionTemplate, $server, $database)
$connection = New-Object System.Data.SqlClient.SqlConnection
$connection.ConnectionString = $connectionString
$command = New-Object System.Data.SqlClient.SqlCommand
$command.CommandText = $tablequery
$command.Connection = $connection
#Load up the Tables in a dataset
$SqlAdapter = New-Object System.Data.SqlClient.SqlDataAdapter
$SqlAdapter.SelectCommand = $command
$DataSet = New-Object System.Data.DataSet
$SqlAdapter.Fill($DataSet)
$connection.Close()
# Loop through all tables and export a CSV of the Table Data
foreach ($Row in $DataSet.Tables[0].Rows)
{
    $queryData = "SELECT * FROM [$($Row[0])].[$($Row[1])]"

    #Specify the output location of your dump file
    $extractFile = "C:\mssql\export\$($Row[0])_$($Row[1]).csv"

    $command.CommandText = $queryData
    $command.Connection = $connection

    $SqlAdapter = New-Object System.Data.SqlClient.SqlDataAdapter
    $SqlAdapter.SelectCommand = $command
    $DataSet = New-Object System.Data.DataSet
    $SqlAdapter.Fill($DataSet)
    $connection.Close()

    $DataSet.Tables[0] | Export-Csv $extractFile -NoTypeInformation
}
Upvotes: 97
Reputation: 1464
Simple one-line solution:
Open Microsoft SQL Server Management Studio (SSMS), connect to your DB and run the following:
SELECT 'sqlcmd -s, -W -Q "set nocount on; select * from ['+DB_NAME()+'].[dbo].[' + st.NAME + ']" | findstr /v /c:"-" /b > "c:\target\' + st.NAME + '.csv"'
FROM sys.tables st
Create a folder c:\target, open a cmd console in that folder, then select and copy all the results of the SQL above and paste them into the cmd window.
That's it: all the CSV files will be created there, each with the column names as its first line.
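If you'd prefer not to copy and paste at all, here is a rough Python sketch of the same idea (assuming pyodbc, a trusted connection, and Windows cmd for the pipe/redirect; the server, database and target folder names are placeholders):

import subprocess
import pyodbc

server, database, target = "MYSERVER", "MYDB", r"c:\target"  # placeholders

conn = pyodbc.connect(f"Driver={{SQL Server}};Server={server};Database={database};Trusted_Connection=yes;")
cursor = conn.cursor()
for (table,) in cursor.execute("SELECT name FROM sys.tables").fetchall():
    # build the same sqlcmd | findstr command the SELECT above generates, one per table
    cmd = (f'sqlcmd -s, -W -Q "set nocount on; select * from [{database}].[dbo].[{table}]" '
           f'| findstr /v /c:"-" /b > "{target}\\{table}.csv"')
    subprocess.run(cmd, shell=True)
conn.close()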
Upvotes: 6
Reputation: 1264
The answer by sree is great. Because my database has multiple schemas, I changed this:
$tablequery = "SELECT schemas.name as schemaName, tables.name as tableName from sys.tables inner join sys.schemas ON tables.schema_id = schemas.schema_id"
and then also
$queryData = "SELECT * FROM [$($Row[0])].[$($Row[1])]"
#Specify the output location of your dump file
$extractFile = "C:\mssql\export\$($Row[0])_$($Row[1]).csv"
Upvotes: 7
Reputation: 49570
You can replicate the schema to a temp database and then use that database to export all tables (along with column names).
Steps:
1) First, create a script for all the tables:
Tasks->Generate Scripts->all tables->single sql file
2) Create a dummy database and run this script.
3) Right-click on the dummy database and select Tasks->Export Data->select the source data->select Microsoft Excel as the destination and give its path->Execute
You'll get a single spreadsheet with all the tables and their columns.
Alternatively, execute the query below; it returns all the table names in the first column and the corresponding column names in the second column of the result.
select t.name ,c.name from sys.columns c
inner join sys.tables t
on t.object_id = c.object_id
order by t.name
Copy this result and paste it into a CSV file.
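If you'd rather write that listing straight to a file instead of copying and pasting, a small Python sketch along these lines would do it (assuming pyodbc and a trusted connection; the connection string and file name are placeholders):

import csv
import pyodbc

query = """
select t.name, c.name from sys.columns c
inner join sys.tables t on t.object_id = c.object_id
order by t.name
"""

conn = pyodbc.connect("Driver={SQL Server};Server=MYSERVER;Database=MYDB;Trusted_Connection=yes;")
cursor = conn.cursor()
with open("tables_and_columns.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.writer(f)
    writer.writerow(["tableName", "columnName"])  # header row
    writer.writerows(cursor.execute(query))       # one row per (table, column) pair
conn.close()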
Upvotes: 0
Reputation: 71
Comment on @annem-srinivas's solution: if you use schemas other than the default (dbo), change the following in his script:
$tablequery = "SELECT schemas.name as schemaName, tables.name AS tableName from sys.tables INNER JOIN sys.schemas ON schemas.schema_id = tables.schema_id ORDER BY schemas.name, tables.name"
and
$queryData = "SELECT * FROM [$($Row[0])].[$($Row[1])]"
$extractFile = "C:\temp\export\$($Row[0]).$($Row[1]).csv"
Upvotes: 2
Reputation: 13949
Instead of clicking Export Data, choose Generate Scripts. Select the tables you want, click Next, and then click the Advanced button. The last option under General is Types of data to script. Choose Schema and data or just Data.
Upvotes: 16