Reputation: 521
I have a Python file and multiple JSON files (around 20) in my directory. I have a PostgreSQL schema defined as well. How do I parse the JSON files and store their values in my PostgreSQL table? Here is my schema .sql file:
CREATE TABLE IF NOT EXISTS node_data_table (
id SERIAL,
block_producer_key VARCHAR,
recieved_at timestamp,
online_status BOOLEAN DEFAULT TRUE,
added_time timestamp,
PRIMARY KEY(block_producer_key, recieved_at)
);
And here is my Python file:
import json
import glob
from sqlalchemy import create_engine
db_name = 'playground'
db_user = 'postgres'
db_pass = 'username@1'
db_host = 'db'
db_port = '5432'
db_string = 'postgresql://{}:{}@{}:{}/{}'.format(db_user, db_pass, db_host, db_port, db_name)
db = create_engine(db_string)
for filename in glob.glob("2021-03-05.*.json"):
    with open(filename) as json_data:
        data = json.load(json_data)
        columns = list(data.keys())
        print('Column names', columns)
The output:
Column names ['receivedAt', 'receivedFrom', 'blockProducerKey']
What I am trying to do here is get a list of column names. I want the values under these column names to be stored in the PostgreSQL database. How do I do that? I am aware that the data may need to be serialized with json.dumps, but how do I interact with the PostgreSQL database? Please help!
Upvotes: 1
Views: 1079
Reputation: 148870
Ok, you already have an SQLAlchemy engine. You can just use a connection and insert your data:
from sqlalchemy import create_engine, text

db = create_engine(db_string)
conn = db.connect()
with conn.begin():  # use a single transaction
    for filename in glob.glob("2021-03-05.*.json"):
        with open(filename) as json_data:
            data = json.load(json_data)
        # The JSON keys are camelCase while the table columns are snake_case,
        # so map them explicitly; text() lets us use :name bind parameters.
        conn.execute(
            text("""INSERT INTO node_data_table (block_producer_key, recieved_at)
                    VALUES (:block_producer_key, :recieved_at)"""),
            {"block_producer_key": data["blockProducerKey"],
             "recieved_at": data["receivedAt"]},
        )
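If you would rather send everything to the database in one round trip, execute() also accepts a list of parameter dictionaries and issues an executemany. A minimal sketch, assuming the same file naming pattern and camelCase JSON keys as above:

import json
import glob

from sqlalchemy import create_engine, text

db = create_engine(db_string)  # db_string as defined in the question

stmt = text("""INSERT INTO node_data_table (block_producer_key, recieved_at)
               VALUES (:block_producer_key, :recieved_at)""")

# Build one parameter dict per JSON file.
params = []
for filename in glob.glob("2021-03-05.*.json"):
    with open(filename) as json_data:
        data = json.load(json_data)
    params.append({"block_producer_key": data["blockProducerKey"],
                   "recieved_at": data["receivedAt"]})

with db.connect() as conn, conn.begin():
    conn.execute(stmt, params)  # executemany: one statement, all rows

Also note that (block_producer_key, recieved_at) is the table's composite primary key, so re-running the script over the same files will raise duplicate-key errors; adding ON CONFLICT DO NOTHING to the INSERT is one way to make the load idempotent.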
Upvotes: 1