Reputation: 362
I've created a Python program that converts CSV to JSON and inserts the JSON data into an Oracle database table. The CSV-to-JSON conversion works fine and I'm able to connect to the Oracle DB, but while inserting the data I get the error 'Error while inserting the data: expecting string or bytes object'. I'm using the executemany function because it gives better performance than execute. My parsed data looks like this:
[{'index': 0, 'Employee Name': 'Tom', 'Employee Age': 35},
 {'index': 1, 'Employee Name': 'Jackson', 'Employee Age': 31}]
Below is my Python code:
import json
import pandas as pd
import cx_Oracle as oracle

df = pd.read_excel("demo.xls", sheet_name='a1')
result = df.to_json(orient='table')
parsed = json.loads(result)

try:
    conn = oracle.connect("username", "password", "oracle_service_name")
except Exception as err:
    print('Error while creating the connection: ', err)
else:
    print('Oracle Version:', conn.version)
    try:
        cur = conn.cursor()
        sql_insert = """INSERT INTO log(record,name,age) VALUES(:0,:1,:2)"""
        cur.executemany(sql_insert, parsed['data'])
    except Exception as err:
        print('Error while inserting the data: ', err)
    else:
        print('Insert Completed.')
        conn.commit()
    finally:
        cur.close()
        conn.close()
Please help me insert the data into the table.
Upvotes: 1
Views: 1671
Reputation: 7086
In your example you are using the bind variable names :0, :1 and :2, but the dictionaries have the keys "index", "Employee Name" and "Employee Age". If you use tuples, as you suggested in your comment, the code would look as follows:
import cx_Oracle

parsed = [
    (0, 'Tom', 35),
    (1, 'Jackson', 31)
]

with cx_Oracle.connect("user/password@host/service_name") as conn:
    cursor = conn.cursor()
    cursor.executemany(
        "insert into log (record, name, age) values (:0, :1, :2)",
        parsed)
    conn.commit()
This code works. I could also provide a sample with named bind variables (and dictionaries); a rough sketch of that variant follows below. Generally, though, a list of tuples is the best approach for batch inserts.
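For completeness, here is a minimal sketch of the named-bind variant mentioned above. The dictionary keys ("record", "name", "age") are my own choice so that they match the bind variable names; Oracle bind names cannot contain spaces, so keys like "Employee Name" would have to be renamed first.

import cx_Oracle

# Sketch only: the keys are assumed to match the bind variable names exactly.
rows = [
    {"record": 0, "name": "Tom", "age": 35},
    {"record": 1, "name": "Jackson", "age": 31}
]

with cx_Oracle.connect("user/password@host/service_name") as conn:
    cursor = conn.cursor()
    cursor.executemany(
        "insert into log (record, name, age) values (:record, :name, :age)",
        rows)
    conn.commit()

With this form each row is bound by key rather than by position, so the order of the keys does not matter, but every dictionary must supply all three keys.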
Upvotes: 1