YONG BAGJNS

Reputation: 531

"Invalid parameter type" (numpy.int64) when inserting rows with executemany()

I'm trying to insert a bunch of data into a database:

insert_list = [(1,1,1,1,1,1), (2,2,2,2,2,2), (3,3,3,3,3,3), ...]  # up to 10,000 tuples in this list

conn = pyodbc.connect('DRIVER={FreeTDS};SERVER=xxxxx;DATABASE=xxxx;UID=xx;PWD=xx;TDS_Version=7.0')
cursor = conn.cursor()

sql = "insert into ScanEMAxEMAHistoryDay(SecurityNumber, EMA1, EMA2, CrossType, DayCross, IsLocalMinMax) values (?, ?, ?, ?, ?, ?)"

cursor.executemany(sql, insert_list)

pyodbc.ProgrammingError: ('Invalid parameter type. param-index=4 param-type=numpy.int64', 'HY105')

Reduced to 100 tuples:

cursor.executemany(sql, insert_list[:100])

pyodbc.ProgrammingError: ('Invalid parameter type. param-index=4 param-type=numpy.int64', 'HY105')

Reduced to 5 tuples:

cursor.executemany(sql, insert_list[:5])
conn.commit()

This inserts into the database successfully.

I have tried running:

sql = 'SET GLOBAL max_allowed_packet=50*1024*1024'
cursor.execute(sql)

before executemany(), but it raises an error:

pyodbc.ProgrammingError: ('42000', "[42000] [FreeTDS][SQL Server]'GLOBAL' is not a recognized SET option. (195) (SQLExecDirectW)")

How do I solve this?

Thank you.

Upvotes: 21

Views: 45382

Answers (4)

Marcos

Reputation: 146

I did the same as Robert; I converted everything to string. In my case, it was a pandas DataFrame that I cast to string type:

data = pandas.read_json(...)
data.astype(str).to_sql(...)

https://pandas.pydata.org/pandas-docs/stable/reference/api/pandas.DataFrame.astype.html
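
For context, here is a minimal sketch of that flow end-to-end; the URL, connection string, and table name are hypothetical placeholders, not from the question:

import pandas
import sqlalchemy

# Hypothetical SQLAlchemy engine; adjust the connection string for your server
engine = sqlalchemy.create_engine("mssql+pyodbc://user:pw@myserver/mydb?driver=FreeTDS")

# Read the JSON, cast every column to str, then write it out;
# casting to str avoids the numpy dtypes that pyodbc rejects
data = pandas.read_json("https://example.com/data.json")
data.astype(str).to_sql("MyTable", engine, if_exists="append", index=False)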

If the data you are retrieving includes URLs, you may get an "unknown protocol" error (or something like that). If you get this error even after casting to string type, try using StringIO instead:

import requests
from io import StringIO
...
data = pandas.read_json(StringIO(response.text))

where response is a Response instance from the requests library and its text attribute contains the JSON data.

Upvotes: 6

Lateef Mohammed

Reputation: 1

I had the same error when passing parameters into a SQL INSERT statement using pyodbc. As mentioned by Robert, simply converting the variables to strings worked.

Example using str():

cursor.execute("INSERT INTO Table ([col1],[col2],[col3]...])VALUES(?,'test',?)", str(value1), str(value3))

Upvotes: 0

Robert Franklin

Reputation: 299

For anyone out there reading this, it was driving me up the wall.

My eventual solution was to convert all variables to 'str' and it worked fine.
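
A minimal sketch of that idea, using the insert_list, cursor, and sql from the question (the conversion itself is mine):

# Convert every element of every row to str before executemany();
# pyodbc always accepts str as a parameter type
insert_list_str = [tuple(str(v) for v in row) for row in insert_list]
cursor.executemany(sql, insert_list_str)
conn.commit()

Note that this relies on the server implicitly converting the strings back to the column types, which works for simple numeric data but can mask genuine type errors.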

Upvotes: 2

Gord Thompson

Reputation: 123829

Your problem is not with the volume of data per se; it is that some of your tuples contain numpy.int64 values that cannot be used directly as parameter values for your SQL statement. For example,

a = numpy.array([10, 11, 12], dtype=numpy.int64)
params = (1, 1, a[1], 1, 1, 1)
crsr.execute(sql, params)

will throw

ProgrammingError: ('Invalid parameter type. param-index=2 param-type=numpy.int64', 'HY105')

because the third parameter value is a numpy.int64 element from your numpy array a. Converting that value with int() will avoid the issue:

a = numpy.array([10, 11, 12], dtype=numpy.int64)
params = (1, 1, int(a[1]), 1, 1, 1)
crsr.execute(sql, params)
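
To apply the same fix to the whole batch from the question, one option is to coerce every numpy scalar back to its native Python type before calling executemany(). This is a sketch using a helper of my own (to_native), assuming the rows may mix numpy and plain Python values:

import numpy

def to_native(value):
    # numpy.generic is the base class of all numpy scalars;
    # .item() converts numpy.int64 -> int, numpy.float64 -> float, etc.
    if isinstance(value, numpy.generic):
        return value.item()
    return value

insert_list_native = [tuple(to_native(v) for v in row) for row in insert_list]
cursor.executemany(sql, insert_list_native)
conn.commit()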

By the way, the reason that

sql = 'SET GLOBAL max_allowed_packet=50*1024*1024'
cursor.execute(sql)

didn't work is that max_allowed_packet is a MySQL setting that does not have any meaning for Microsoft SQL Server.

Upvotes: 30
