samuel oyeleye

Reputation: 85

Importing JSON data file into PostgreSQL using Python and Psycopg2

I am having trouble getting my query to work. I have a JSON file with over 80k lines of data. Since I have been having so many problems I cut the document down to three lines just to see if I can get the data in before I attempt the full 80k lines:

import psycopg2
import io

# raw string (r"...") so the backslashes in the Windows path are not treated as escapes
readTest1 = io.open(r"C:\Users\Samuel\Dropbox\Work\Python and Postgres\test1.json", encoding="utf-8")
readAll = readTest1.readlines()

I have seen online that using readlines() is not the best method, but it is the only method I know. It did read the three lines in the file. I am not sure, but I also expected this to give me a list.
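If the trailing newlines from readlines() turn out to be a problem, I assume I could strip them while reading, something like this (readAll would then be a plain list of JSON strings):

# hypothetical variant: strip the trailing "\n" from each line
readAll = [line.strip() for line in readTest1]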

This creates a table that only takes JSON data:

conn = psycopg2.connect("dbname=python_trial user=postgres")
cur = conn.cursor()
cur.execute("CREATE TABLE test4 (data json);")

Then I try to insert each line:

cur.executemany("INSERT INTO test4 VALUES (%s)", readAll)

The error:

Traceback (most recent call last):
File "<pyshell#13>", line 1, in <module>
cur.executemany("INSERT INTO test4 VALUES (%s)", readAll)
TypeError: not all arguments converted during string formatting

I am not exactly sure what I am doing incorrectly. I am also seeing "\n" when I print(readAll). I think that comes from using readlines(), and I am not sure whether it is also breaking my query.
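To rule out the file itself, I assume I could check that each line is valid JSON on its own (json is from the standard library; this check is not part of my script yet):

import json
for line in readAll:
    json.loads(line.strip())  # raises ValueError if the line is not valid JSON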

Upvotes: 1

Views: 5183

Answers (1)

Eduard Golubov

Reputation: 96

Use this:

for line in readAll:
    cur.execute("INSERT INTO test4 VALUES ('{0}')".format(line))

Or, better, keep the %s placeholder and wrap each line in its own tuple, because executemany() expects a sequence of parameter tuples, one per row:

cur.executemany("INSERT INTO test4 VALUES (%s)", [(line,) for line in readAll])

See: http://initd.org/psycopg/docs/usage.html#passing-parameters-to-sql-queries
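Putting it together, a minimal sketch of the whole import (assuming one JSON object per line in test1.json and the same connection settings as in the question):

import io
import psycopg2

conn = psycopg2.connect("dbname=python_trial user=postgres")
cur = conn.cursor()
cur.execute("CREATE TABLE IF NOT EXISTS test4 (data json);")

# read the file, dropping blank lines and trailing newlines
with io.open(r"C:\Users\Samuel\Dropbox\Work\Python and Postgres\test1.json", encoding="utf-8") as f:
    rows = [(line.strip(),) for line in f if line.strip()]

# one parameter tuple per row
cur.executemany("INSERT INTO test4 VALUES (%s)", rows)
conn.commit()
cur.close()
conn.close()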

Upvotes: 1
