altruic

Reputation: 364

how can I get pyodbc to perform a "SELECT ... INTO" statement without locking?

I'm trying to copy a table in SQL Server, but a simple statement seems to be locking my database when using pyodbc. Here's the code I'm trying:

import pyodbc as db

dbCxn = db.connect(cxnString)
dbCursor = dbCxn.cursor()
query = """\
SELECT TOP(10) *
INTO production_data_adjusted
FROM production_data
"""
dbCursor.execute(query)

The last statement returns immediately, but afterwards both LINQPad and SQL Server Management Studio are locked out of the database (for example, when I try to refresh their table lists). Running sp_who2 shows the LINQPad/SSMS sessions stuck waiting on my pyodbc process. Other databases on the server seem fine, but all access to this database gets held up. The only way I can get these other applications to resolve their stalls is by closing the pyodbc database connection:

dbCxn.close()

This exact same SELECT ... INTO statement works fine and takes only a second from LINQPad and SSMS. The above code also works fine and doesn't lock the database if I remove the INTO line; it even returns results if I add fetchone() or fetchall().

Can anyone tell me what I'm doing wrong here?

Upvotes: 2

Views: 3503

Answers (1)

Bryan

Reputation: 17703

Call the commit method on either the cursor or the connection after the SELECT ... INTO is executed, for example:

...
dbCursor.execute(query)
dbCursor.commit()
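
The commit releases the locks that the uncommitted SELECT ... INTO was holding on the target database, which is why the other tools unblock. As a side note, a pyodbc connection can also be used as a context manager; in recent pyodbc versions, leaving the with block commits the pending transaction (it does not close the connection). A minimal sketch of that variant, assuming cxnString is the same connection string as in the question:

import pyodbc

dbCxn = pyodbc.connect(cxnString)

# Leaving the "with" block commits the transaction, releasing the locks
# held by the SELECT ... INTO; the connection itself stays open.
with dbCxn:
    dbCursor = dbCxn.cursor()
    dbCursor.execute("""\
        SELECT TOP(10) *
        INTO production_data_adjusted
        FROM production_data
        """)

dbCxn.close()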

Alternatively, automatic commit of each statement can be enabled when the connection is created by passing autocommit. Note that autocommit is a keyword argument to the connect function, not a connection string attribute, for example:

...
dbCxn = db.connect(cxnString, autocommit=True)
...
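
Putting it together with the question's code, a minimal sketch of the autocommit variant (assuming cxnString is the same connection string as above):

import pyodbc

# With autocommit=True, each statement is committed as it executes,
# so the SELECT ... INTO releases its locks without an explicit commit().
dbCxn = pyodbc.connect(cxnString, autocommit=True)
dbCursor = dbCxn.cursor()
dbCursor.execute("""\
SELECT TOP(10) *
INTO production_data_adjusted
FROM production_data
""")
dbCxn.close()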

Upvotes: 1
