Gauri

Reputation: 329

Why does my Postgres table show NULL in every column after executing an SQL INSERT query with JSON data?

I am trying to insert JSON data into a Postgres table using this query:

INSERT INTO rf_dsgns 
SELECT * FROM json_populate_recordset(NULL::rf_dsgns,
'[
  {
    "Tracking_ID": 2377125,
    "Constr_Zone": "Cleveland",
    "AF_Name": "PbCleveland_10236716P",
    "Address": "4755 1/2 Rose Avenue",
    "Zip_Code": 44867,
    "Latitude": 5.8923486,
    "Longitude": -71.71052258
  },{
    "Tracking_ID": 2377126,
    "Constr_Zone": "Cleveland",
    "AF_Name": "PggClevelandCLE_25236718P",
    "Street_Address": "4413 1/3 Clain Avenue",
    "Zip_Code": 44225,
    "Latitude": 40.88960254,
    "Longitude": -71.20898567
  }]');

The data types I used while creating my table are integer, character, character, character, integer, numeric, and numeric, respectively. My CREATE TABLE script is:

CREATE TABLE rf_dsgns
(
    tracking_id integer,
    constr_zone character(300),
    af_name character(300),       
    address character(300),        
    zip_code integer,
    latitude numeric,
    longitude numeric       
);

Upvotes: 0

Views: 507

Answers (1)

user330315


You most probably created the table without double quotes around the column names (which is a good thing). However, json_populate_recordset() matches column names case-sensitively, so the lower-case column names in the table are not matched by the mixed-case keys in the JSON.

This:

create table rf_dsgns ("Tracking_ID" int, "Constr_Zone" text, "AF_Name" text, "Address" text, "Zip_Code" text, "Latitude" numeric, "Longitude" numeric);

SELECT * 
FROM json_populate_recordset(NULL::rf_dsgns,
'[
  {
    "Tracking_ID": 2377125,
    "Constr_Zone": "Cleveland",
    "AF_Name": "PbCleveland_10236716P",       
    "Address": "4755 1/2 Rose Avenue",
    "Zip_Code": 44867,
    "Latitude": 5.8923486,
    "Longitude": -71.71052258
  },{
    "Tracking_ID": 2377126,
    "Constr_Zone": "Cleveland",
    "AF_Name": "PggClevelandCLE_25236718P",       
    "Street_Address": "4413 1/3 Clain Avenue",  
    "Zip_Code": 44225,
    "Latitude": 40.88960254,
    "Longitude": -71.20898567        
  }]');

returns:

Tracking_ID | Constr_Zone | AF_Name                   | Address              | Zip_Code | Latitude    | Longitude   
------------+-------------+---------------------------+----------------------+----------+-------------+-------------
    2377125 | Cleveland   | PbCleveland_10236716P     | 4755 1/2 Rose Avenue |    44867 |   5.8923486 | -71.71052258
    2377126 | Cleveland   | PggClevelandCLE_25236718P |                      |    44225 | 40.88960254 | -71.20898567

However, when the table is created without quotes:

create table rf_dsgns (tracking_id int, constr_zone text, af_name text, address text, zip_code text, latitude numeric, longitude numeric);

Then no columns are matched and every column will be NULL.
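
As a quick check, something like this shows how the column names were actually stored in the catalog:

-- unquoted identifiers are folded to lower case when the table is created
select column_name
from information_schema.columns
where table_name = 'rf_dsgns';

With the unquoted definition it returns tracking_id, constr_zone, and so on, so none of the mixed-case JSON keys can match.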

I would not re-create the table with case-sensitive column names. I can see two workarounds:

1. Create a new type that uses the quoted identifiers but is otherwise identical to the table definition, and use that type to map the JSON data (see the sketch at the end of this answer).

2. Use json_array_elements() instead and spell out all column names manually. That is a bit more typing, but it does not duplicate the type definition (and is actually a bit more flexible):

insert into rf_dsgns
-- the keys are matched case-sensitively here as well, so spell them exactly as in the JSON
SELECT (j ->> 'Tracking_ID')::int,
       j ->> 'Constr_Zone',
       j ->> 'AF_Name',
       j ->> 'Address',
       (j ->> 'Zip_Code')::int,
       (j ->> 'Latitude')::numeric,
       (j ->> 'Longitude')::numeric
FROM json_array_elements('.... your json here ') as t(j);
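
For the first workaround, here is a minimal sketch; the type name rf_dsgns_json is only an example:

-- a type with the quoted, mixed-case column names, otherwise identical to the table
create type rf_dsgns_json as
(
    "Tracking_ID" integer,
    "Constr_Zone" text,
    "AF_Name"     text,
    "Address"     text,
    "Zip_Code"    integer,
    "Latitude"    numeric,
    "Longitude"   numeric
);

-- map the JSON against the type, then insert positionally into the table
insert into rf_dsgns
SELECT *
FROM json_populate_recordset(NULL::rf_dsgns_json, '.... your json here ');

The type's columns are declared in the same order as the table's and with compatible types, so SELECT * maps onto rf_dsgns by position.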

Upvotes: 3
