Reputation: 642
I am trying the following code:
df = pandas.read_sql_table(table_name, engine)
df.to_sql(new_table_name, engine)
That is, I read a table and write it back to SQL under a new name. I have also tried doing some manipulation on the table first, but I still get the same error:
ProgrammingError
\sqlalchemy\engine\base.py in _execute_context(self, dialect, constructor, statement, parameters, *args)
   1192                     parameters,
-> 1193                     context)
   1194         except BaseException as e:
..\engine\default.py
    506     def do_execute(self, cursor, statement, parameters, context=None):
    507         cursor.execute(statement, parameters)
ProgrammingError: ('The SQL contains 12366 parameter markers, but 77902 parameters were supplied', 'HY000')
and
DBAPIError: (pyodbc.Error) ('[07002] [Microsoft][SQL Server Native Client 11.0]COUNT field incorrect or syntax error (0) (SQLExecDirectW)') [SQL: 'INSERT INTO [testName] ([index], [col1], [col2] ... [colN]) VALUES (?, ?, ?, ...?), (?, ?, ?, ...?), (?, ?, ?, ...?), (?, ?, ?, ...?) ... (?, ?, ?, ...?)']
The number of ?s in each group equals the number of columns, and the number of (?, ?, ?, ...?) groups equals the number of rows.
When I try the same code on a smaller table, or when I split this table into two tables with fewer than 7 columns each, the load and update work perfectly.
Upvotes: 1
Views: 363
Reputation: 64959
I'm guessing you've hit an internal limit on the number of bind parameters in a SQL statement within SQL Server or the driver used to connect to it. Interestingly, 77902 - 12366 == 65536 == 2**16, which makes me think that a 16-bit integer somewhere has overflowed.
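As a purely illustrative sanity check, the reported marker count is exactly what you get by reducing the supplied parameter count modulo 2**16:

# Illustrative arithmetic only: the driver appears to report the parameter
# count modulo 2**16, which is consistent with a 16-bit overflow.
>>> 77902 % 2**16
12366
>>> 77902 - 12366 == 2**16
True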
Fortunately there seems to be a parameter in the to_sql method that you can use to break the SQL statements into chunks rather than sending the lot in one go. From the Pandas documentation for to_sql:
chunksize : int, optional
Rows will be written in batches of this size at a time. By default, all rows will be written at once.
The total number of parameters (77902) factorises as 2 × 11 × 3541, with 3541 being prime. It looks like your table has either 11 columns and 7082 rows or 22 columns and 3541 rows. To avoid this error I would recommend keeping the number of bind parameters in a chunk below 32767, just in case the 16-bit integer I suspect to be causing this problem is signed, so chunksize=1000 would probably work for you. At the same time, however, this is unlikely to give much faster performance than chunksize=100, so it's not worth spending much time trying to find the largest chunk size that will work.
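As a minimal sketch of how that could look (the connection string and source table name here are placeholders for whatever you are actually using):

import pandas
import sqlalchemy

# Placeholder connection string - substitute your own server and database.
engine = sqlalchemy.create_engine(
    "mssql+pyodbc://user:password@server/database"
    "?driver=SQL+Server+Native+Client+11.0"
)

# "source_table" is a placeholder for the table you are reading.
df = pandas.read_sql_table("source_table", engine)

# Write in batches of 1000 rows so each INSERT statement stays well under
# the apparent limit on bind parameters per statement.
df.to_sql("testName", engine, chunksize=1000)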
Upvotes: 1