Reputation: 85
I'm trying to read CLOB data from a test database and insert it into a dev database. I'm able to do that, but the performance is very poor: 100K rows take 8 to 12 hours, running from my local machine. I'm wondering whether my approach is correct or whether there is a better way of doing it. Below is my code after the connections are set up:
x = []
for row in rows.fetchall():
    data = row[0].read()        # each row holds a LOB locator; read() pulls the whole CLOB
    json_data = json.loads(data)
    x.append(json_data)
This is how I'm doing it; I just wanted to know if there is a better way. Stack: Python, Oracle DB, cx_Oracle, json. Thanks
Upvotes: 2
Views: 11623
Reputation: 7086
The following code, based on the cx_Oracle sample for fetching LOBs as strings, is what you want to use. By default each CLOB comes back as a LOB locator, and every read() costs a separate round trip to the database; the output type handler below tells cx_Oracle to fetch CLOBs directly as strings instead. This should dramatically improve performance!
import json
import cx_Oracle

def OutputTypeHandler(cursor, name, defaultType, size, precision, scale):
    # fetch CLOBs as strings and BLOBs as bytes instead of LOB locators
    if defaultType == cx_Oracle.CLOB:
        return cursor.var(cx_Oracle.LONG_STRING, arraysize=cursor.arraysize)
    elif defaultType == cx_Oracle.BLOB:
        return cursor.var(cx_Oracle.LONG_BINARY, arraysize=cursor.arraysize)

conn = cx_Oracle.Connection("user/pw@dsn")
conn.outputtypehandler = OutputTypeHandler
cursor = conn.cursor()
cursor.execute("""
        select CLOBColumn
        from SomeTable""")
json_data = [json.loads(s) for s, in cursor]
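Since you're also inserting the rows into the dev database, it's worth batching that side as well; a row-at-a-time insert costs one round trip per row. Below is a minimal sketch using cursor.executemany(), assuming a hypothetical target table DevTable(CLOBColumn) and connect string dev_user/pw@dev_dsn (adjust both to your schema):

import json
import cx_Oracle

dev_conn = cx_Oracle.Connection("dev_user/pw@dev_dsn")  # assumed connect string
dev_cursor = dev_conn.cursor()

# json_data comes from the fetch above; re-serialize each document for insert
batch = [(json.dumps(doc),) for doc in json_data]

# one executemany() call sends the whole batch in a single round trip
dev_cursor.executemany("insert into DevTable (CLOBColumn) values (:1)", batch)
dev_conn.commit()

For 100K rows you may want to split the batch into chunks (say, 10K rows per executemany() call) to keep memory bounded.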
Upvotes: 2