Reputation: 1083
I am trying to use the BULK INSERT statement in SQL Server 2014, issued from a Python script, to insert a large CSV file into a database table. I have the following example code:
import pymssql  # driver used by SQLAlchemy's mssql+pymssql dialect
from sqlalchemy import create_engine

engine = create_engine('connection_string_goes_here', encoding="utf-8")
table_name = 'my_table'
# Backslashes escaped: in a plain literal, the \t in '\to' and '\test'
# would be interpreted as tab characters.
source_path = 'C:\\path\\to\\csv\\test.csv'
q = ("BULK INSERT " + table_name + " " +
     "FROM '" + source_path + "' " +
     "WITH ( DATAFILETYPE = 'char', " +
     "FIELDTERMINATOR = ',', " +
     "ROWTERMINATOR = '\\n')")
connection = engine.connect()
connection.execute(q)
connection.close()
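As a standalone check (no database connection needed; same placeholder table name and path as above), I can print the assembled statement to see exactly what text gets sent to SQL Server:

```python
# Build the same statement as above, without touching the database,
# just to inspect the final SQL text.
table_name = 'my_table'
source_path = 'C:\\path\\to\\csv\\test.csv'  # escaped form of the example path

q = ("BULK INSERT " + table_name + " " +
     "FROM '" + source_path + "' " +
     "WITH ( DATAFILETYPE = 'char', " +
     "FIELDTERMINATOR = ',', " +
     "ROWTERMINATOR = '\\n')")

print(q)
```

The printed statement contains single backslashes in the path, which suggests the doubled backslashes in the error message below are just Python's repr-style escaping of the string.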
The error I get is:
(pymssql.OperationalError) (4861, Cannot bulk load because the file "C:\\path\\to\\csv\\test.csv" could not be opened. Operating system error code 3(The system cannot find the path specified.).DB-Lib error message 20018…
I believe this is a double-backslash issue: the backslashes appear to be included literally in the BULK INSERT statement instead of being collapsed into single backslashes. I don't see anything in the SQL Server error log.
My setup is MS SQL Server Express 2014, with Python 2.7.9 running on a separate Windows box.
I have tried using a raw string (r'C:\path\to\csv\test.csv') and the decode string function on source_path, but neither has helped.
Upvotes: 4
Views: 1910