Reputation: 472
Would be interested if someone can point me in the right direction.
I have one very long stored proc (that also contains calls to other stored procs within it) that updates various tables.
If I run the stored proc in Management Studio it runs fine. If I call it from pyodbc, the tables do not end up updated.
I run pyodbc to execute stored procedures all the time with no problems. I know there is nothing wrong with my connections or calls, because if I substitute a shorter stored proc into the Python code in the same place, it works fine.
The stored proc does generate a few "Warning: Null value is eliminated by an aggregate or other SET operation" messages. I thought these might be causing problems, but whenever I try to SET ANSI_WARNINGS { ON | OFF }, either inside the stored proc or outside it, I get a pyodbc.ProgrammingError.
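For reference, this is roughly how I have been trying to issue the SET from pyodbc (just a minimal sketch, using the same placeholder server, database and proc names as the code in my update below; I may well be going about this part wrong):

import pyodbc

conn_str = 'Driver={SQL Server};SERVER=insertsqlservername;DATABASE=insertdbasename;trusted=1'
conn = pyodbc.connect(conn_str)
cursor = conn.cursor()
cursor.execute("SET ANSI_WARNINGS ON")       # session-level setting, issued as its own statement
cursor.execute("exec InsertStoredProcName")  # then call the proc on the same connection
conn.commit()
cursor.close()
conn.close()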
Any guesses as to the problem?
Python 3.4 (have the same problem in 2.7), MSSQL, Windows 7
UPDATED:
import sys
import pyodbc as p

def getconn():
    server = 'insertsqlservername'
    dbase = 'insertdbasename'
    connStr = ('Driver={SQL Server};SERVER=' +
               server + ';DATABASE=' + dbase + ';' +
               'trusted=1')
    conn = p.connect(connStr)
    return conn

def runSQL():
    conn = getconn()
    cursor = conn.cursor()
    try:
        cursor.execute('exec InsertStoredProcName')
        conn.commit()
    except:
        # print the exception type so the failure is not silently swallowed
        print(sys.exc_info()[0])
    cursor.close()
    conn.close()
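(The bare except above hides the actual error text. This isn't my real code, just a variant I could use to surface the full driver error, still with the placeholder proc name:)

import pyodbc as p

def runSQL_verbose():
    conn = getconn()
    cursor = conn.cursor()
    try:
        cursor.execute('exec InsertStoredProcName')
        conn.commit()
    except p.Error as e:
        # print the full pyodbc/driver error instead of just the exception type
        print(e)
    finally:
        cursor.close()
        conn.close()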
Just to be 100% clear on the problem, there seem to be two components: calling the stored proc from pyodbc (it runs fine from Management Studio), and this particular long stored proc (shorter procs called the same way work fine). It seems to be something about the combination of the two.
Upvotes: 3
Views: 3412
Reputation: 14361
Can you include your pyodbc connection string? It sounds like you're not setting autocommit when you connect, which causes the changes you make to be rolled back when the connection is closed. If that is the case, you have two options. First, you can turn autocommit on when you connect:
conn = pyodbc.connect(connection_str, autocommit=True)
Or, before you close the connection:
conn.commit()
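Putting that together with the getconn()/runSQL() code from the question, a minimal sketch of the two options (reusing the question's placeholder server, database and proc names) could look like this:

import pyodbc

conn_str = 'Driver={SQL Server};SERVER=insertsqlservername;DATABASE=insertdbasename;trusted=1'

# Option 1: autocommit at connect time, so each statement is committed as it runs
conn = pyodbc.connect(conn_str, autocommit=True)
cursor = conn.cursor()
cursor.execute('exec InsertStoredProcName')
cursor.close()
conn.close()

# Option 2: default transaction behaviour, with an explicit commit before closing
conn = pyodbc.connect(conn_str)
cursor = conn.cursor()
cursor.execute('exec InsertStoredProcName')
conn.commit()
cursor.close()
conn.close()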
Upvotes: 5