Paul

Reputation: 57

Insert JSON into Postgres 9.5

In Python 3.6 I have a variable 'data2' which contains JSON data (held as a Python dict) that looks like this:

{
    'id': 4573457, 'account_id': 456, 'address': '15 Millers Rd, WA', 
    'category_id': 4565, 'description': None, 'is_anonymous': False, 
    'iso_created_at': '2017-11-21T14:08:54+11:00', 
    'location': {
        'latitude': -56.564848493, 'longitude': 345.5948493}, 
    'report_state_id': 45655, 'report_state_name': 'ALL PICKED', 
    'title': 'South', 'user_id': 44555, 'user_short_name': 'Todd G.', 
    'users_alerted_count': 0, 'users_opened_count': 6, 'shape_id': 56
}

I want to write an insert statement to insert this data into a table I have already created in Postgres 9.5, the table has 3 columns - channel, report_id and report_data -

I want to insert the 'account_id' from the JSON into the channel column, the 'id' into the report_id column and the rest of the JSON all into the report_data column.

Can someone please tell me how I can do this?

Upvotes: 3

Views: 8637

Answers (3)

Ferris

Reputation: 5601

The answer above is somewhat complicated.

From the official psycopg2 documentation, there is a class called psycopg2.extras.Json, which is an ISQLQuote wrapper that adapts a Python object to the json data type.

import psycopg2
import logging
from psycopg2.extras import Json

conn = psycopg2.connect(dbname="  ", 
                        user="  ",
                        password="  ", 
                        host="127.0.0.1", 
                        port="5432")
data2 = {
    'id': 4573457, 'account_id': 456, 'address': '15 Millers Rd, WA', 
    'category_id': 4565, 'description': None, 'is_anonymous': False, 
    'iso_created_at': '2017-11-21T14:08:54+11:00', 
    'location': {
        'latitude': -56.564848493, 'longitude': 345.5948493}, 
    'report_state_id': 45655, 'report_state_name': 'ALL PICKED', 
    'title': 'South', 'user_id': 44555, 'user_short_name': 'Todd G.', 
    'users_alerted_count': 0, 'users_opened_count': 6, 'shape_id': 56
}

item = {
    'channel': data2['account_id'],
    'report_id': data2['id'],
    # Json() adapts the remaining keys so psycopg2 can store them as json
    'report_data': Json({k: v for k, v in data2.items() if k not in ('account_id', 'id')})
}



def sql_insert(tableName, data_dict):
    '''
        INSERT INTO onetable (channel,  report_id,  report_data)
        VALUES (%(channel)s, %(report_id)s, %(report_data)s );
    '''
    # Build "%(key)s" placeholders from the dict keys, producing the
    # statement shown in the docstring above.
    sql = '''
        INSERT INTO %s (%s)
        VALUES (%%(%s)s );
        '''   % (tableName, ',  '.join(data_dict),  ')s, %('.join(data_dict))
    return sql

tableName = 'onetable'
sql = sql_insert(tableName, item)
try:
    with conn.cursor() as cur:
        cur.execute(sql, item)
    conn.commit()
except Exception as e:
    logging.debug(e)
    conn.rollback()
finally:
    conn.close()
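If generating the SQL from the dict keys feels indirect, the same insert can be written with explicit placeholders. A minimal sketch (table and column names taken from the question; json.dumps stands in for the Json adapter so the snippet runs without a database):

```python
import json

data2 = {'id': 4573457, 'account_id': 456, 'address': '15 Millers Rd, WA'}

# Split the payload: two scalar columns, the rest as one JSON document.
report_data = {k: v for k, v in data2.items() if k not in ('account_id', 'id')}
params = (data2['account_id'], data2['id'], json.dumps(report_data))

# With a live connection this becomes:
# with conn.cursor() as cur:
#     cur.execute(
#         "INSERT INTO onetable (channel, report_id, report_data) "
#         "VALUES (%s, %s, %s)",
#         params,
#     )
```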

Upvotes: 1

ravioli

Reputation: 3823

Here's the basic way to extract the JSON values in Postgres:

  data2->'account_id' AS channel

So, your execute SQL should be something like this:

cursor.execute("""
    INSERT INTO MyTable (channel, report_id, report_data)
    SELECT 
      src.MyJSON->'account_id',
      src.MyJSON->'id',
      src.MyJSON
    FROM (
      SELECT %s AS MyJSON
    ) src
    """,
  (data2,)
)

If you want to remove the account_id/id keys before inserting the rest of the JSON into the report_data field, you can create a second "data2" variable (e.g. "data2_final") with those keys removed and pass that as the parameter instead. Let me know how it works for you.
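Alternatively, Postgres itself can drop the two keys with jsonb's `-` (delete-key) operator, so no second Python variable is needed. A sketch along those lines, assuming the table from the question:

```python
# jsonb's `-` operator (Postgres 9.5+) removes a key from the document,
# so account_id and id are stripped on the server side before storage.
sql = """
    INSERT INTO MyTable (channel, report_id, report_data)
    SELECT
      CAST(src.MyJSON->>'account_id' AS INTEGER),
      CAST(src.MyJSON->>'id' AS INTEGER),
      src.MyJSON - 'account_id' - 'id'
    FROM (
      SELECT CAST(%s AS JSONB) AS MyJSON
    ) src
"""
# cursor.execute(sql, (json.dumps(data2),))
```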

Updated

CREATE TABLE Mytable (
  channel INTEGER, 
  report_id INTEGER, 
  report_data JSONB
);

cursor.execute("""
    INSERT INTO MyTable (channel, report_id, report_data)
    SELECT 
      CAST(src.MyJSON->>'account_id' AS INTEGER),
      CAST(src.MyJSON->>'id' AS INTEGER),
      src.MyJSON
    FROM (
      SELECT CAST(%s AS JSONB) AS MyJSON
    ) src
    """,
  (data2,)
)

http://www.sqlfiddle.com/#!17/fb3af/1

I updated the extract to use the ->> operator, which returns the JSON values as text (-> returns jsonb, which can't be cast directly), and then cast them as INTEGER.

Updated Update

I formatted your code below as I understood it, with the changes I made noted below:

def calldb( db, sql_cmd): 
    try: 
        cur = db.cursor() 
        cur.execute(sql_cmd, (data2,)) 
        return 
    except Exception as e: 
        print ('Error ', e ) 
        raise 

sql_cmd=" INSERT INTO MyTable (channel, report_id, report_data) SELECT CAST(src.MyJSON->>'account_id' AS INTEGER), CAST(src.MyJSON->>'id' AS INTEGER), src.MyJSON FROM ( SELECT CAST(%s AS JSONB) AS MyJSON ) src" 
calldb(conn, sql_cmd) 
conn.commit()

Changes:

  • removed the extra double-quotes at start and end of sql_cmd
  • added a double-quote after "src" in the query
  • moved the (data2,) tuple to the cur.execute() call

The way the execute() function works is you pass it the SQL string you want to execute (i.e. sql_cmd) as the first parameter. The %s thingies in the string are placeholders for the parameterized values. As the second argument, you pass a tuple containing the parameter values (i.e. (data2,)).
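A quick self-check of that mechanism: the number of %s placeholders in the SQL must match the length of the parameter tuple. Names below are from the answer above; no database is needed for this part:

```python
import json

data2 = {'id': 4573457, 'account_id': 456}

sql_cmd = ("INSERT INTO MyTable (channel, report_id, report_data) "
           "SELECT CAST(src.MyJSON->>'account_id' AS INTEGER), "
           "CAST(src.MyJSON->>'id' AS INTEGER), src.MyJSON "
           "FROM (SELECT CAST(%s AS JSONB) AS MyJSON) src")

params = (json.dumps(data2),)    # one %s in the SQL, so a one-element tuple

# cur.execute(sql_cmd, params)   # psycopg2 quotes and substitutes safely
assert sql_cmd.count('%s') == len(params)
```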

Fingers crossed :)

Updated Updated Update
Here is the working code (slightly modified version from what you provided):

import psycopg2
import json

def calldb(db, sql_cmd, sql_params): 
    try: 
        cur = db.cursor()
        cur.execute(sql_cmd, sql_params)
        return
    except Exception as e: 
        print ('Error ', e ) 
        raise 

params = {
  "host":"DB_HOSTNAME",
  "database":"DB_NAME",
  "user":"USERNAME",
  "password":"PASSWORD"
}

conn = psycopg2.connect(**params)

# Prepare SQL
sql_cmd = "INSERT INTO MyTable (channel, report_id, report_data) SELECT CAST(src.MyJSON->>'account_id' AS INTEGER), CAST(src.MyJSON->>'id' AS INTEGER), src.MyJSON FROM ( SELECT CAST(%s AS JSONB) AS MyJSON ) src" 

# Convert dictionary to native JSON data type
data2 = {"id": 4573457, "account_id": 456, "address": "15 Millers Rd, WA"}
data2_json = json.dumps(data2)

sql_params = (data2_json,)

# Execute SQL
calldb(conn, sql_cmd, sql_params) 
conn.commit()

Changes

  • added sql_params variable to calldb() to pass along sql parameters
  • added connection params line to connect to DB (not sure how you were doing it in yours)
  • converted data2 dictionary data type to JSON data type (that's related to why you were getting the previous "can't adapt dict" error)
  • Suggestion: Not sure if you do this or not, but you should also close your DB cursor and connection when you're done with them

You can clean it up and modify as needed. Give it a try and let me know.
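On the cleanup suggestion, here is one hedged sketch of how calldb() could handle it (same names as above; psycopg2's cursor works as a context manager and closes itself, but the connection still needs an explicit close):

```python
def calldb(db, sql_cmd, sql_params):
    """Execute one statement, commit on success, roll back on error."""
    try:
        with db.cursor() as cur:   # cursor is closed automatically on exit
            cur.execute(sql_cmd, sql_params)
        db.commit()
    except Exception as e:
        print('Error ', e)
        db.rollback()
        raise
    finally:
        db.close()                 # close the connection when done
```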

Upvotes: 4

Ankur Srivastava

Reputation: 923

You can insert using the jsonb data type. Please refer to the link below: https://www.compose.com/articles/faster-operations-with-the-jsonb-data-type-in-postgresql/
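In concrete terms for the question's table, that could look something like this (a sketch only; the DDL mirrors the three columns described in the question, and the `::jsonb` cast lets a JSON string be stored directly):

```python
import json

ddl = """
CREATE TABLE IF NOT EXISTS MyTable (
  channel     INTEGER,
  report_id   INTEGER,
  report_data JSONB
);
"""

insert_sql = "INSERT INTO MyTable VALUES (%s, %s, %s::jsonb)"

data2 = {'id': 4573457, 'account_id': 456, 'address': '15 Millers Rd, WA'}
rest = json.dumps({k: v for k, v in data2.items()
                   if k not in ('account_id', 'id')})

# With a live cursor:
# cur.execute(ddl)
# cur.execute(insert_sql, (data2['account_id'], data2['id'], rest))
```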

Upvotes: 0
