Sean Kilb

Reputation: 979

How can I convert JSON to a database table dynamically?

I need to save JSON data to an Oracle database. The JSON looks like this (see below), but it doesn't always stay in the same format: I might add additional nodes or modify existing ones. So is it possible to create or modify Oracle tables dynamically to add more columns? I was planning to do that with Java: create a Java class matching the JSON, convert the JSON to a Java object, and persist it to the table. But how can I modify the Java class dynamically? Or would it be a better idea to do this with PL/SQL? The JSON comes from a mobile device to a REST web service.

{"menu": {
  "id": "file",
  "value": "File",
  "popup": {
    "menuitem": [
      {"value": "New", "onclick": "CreateNewDoc()"},
      {"value": "Open", "onclick": "OpenDoc()"},
      {"value": "Close", "onclick": "CloseDoc()"}
    ]
  }
}}

Upvotes: 2

Views: 6543

Answers (5)

shubhi agarwal

Reputation: 1

This can be done with SQL Server:

The following code takes a JSON input string and automatically generates SQL Server CREATE TABLE statements, making it easier to convert serialized data into a database schema.

It is not perfect, but it should provide a decent starting point when working with new JSON files.

    SET NOCOUNT ON;
    
    DECLARE 
        @JsonData nvarchar(max) = '
            {
                "Id" : 1,
                "IsActive":true,
                "Ratio": 1.25,
                "ActivityArray":[true,false,true],
                "People" : ["Jim","Joan","John","Jeff"],
                "Places" : [{"State":"Connecticut", "Capitol":"Hartford", "IsExpensive":true},{"State":"Ohio","Capitol":"Columbus","MajorCities":["Cleveland","Cincinnati"]}],
                "Thing" : { "Type":"Foo", "Value" : "Bar" },
                "Created_At":"2018-04-18T21:25:48Z"
            }',
        @RootTableName nvarchar(4000) = N'AppInstance',
        @Schema nvarchar(128) = N'dbo',
        @DefaultStringPadding smallint = 20;
    
    DROP TABLE IF EXISTS ##parsedJson;
    WITH jsonRoot AS (
        SELECT 
            0 as parentLevel, 
            CONVERT(nvarchar(4000),NULL) COLLATE Latin1_General_BIN2 as parentTableName, 
            0 AS [level], 
            [type] ,
            @RootTableName COLLATE Latin1_General_BIN2 AS TableName,
            [key] COLLATE Latin1_General_BIN2 as ColumnName,
            [value],
            ROW_NUMBER() OVER (ORDER BY (SELECT 1)) AS ColumnSequence
        FROM 
            OPENJSON(@JsonData, '$')
        UNION ALL
        SELECT 
            jsonRoot.[level] as parentLevel, 
            CONVERT(nvarchar(4000),jsonRoot.TableName) COLLATE Latin1_General_BIN2, 
            jsonRoot.[level]+1, 
            d.[type],
            CASE WHEN jsonRoot.[type] IN (4,5) THEN CONVERT(nvarchar(4000),jsonRoot.ColumnName) ELSE jsonRoot.TableName END COLLATE Latin1_General_BIN2,
            CASE WHEN jsonRoot.[type] IN (4) THEN jsonRoot.ColumnName ELSE d.[key] END,
            d.[value],
            ROW_NUMBER() OVER (ORDER BY (SELECT 1)) AS ColumnSequence
        FROM 
            jsonRoot
            CROSS APPLY OPENJSON(jsonRoot.[value], '$') d
        WHERE 
            jsonRoot.[type] IN (4,5) 
    ), IdRows AS (
        SELECT 
            -2 as parentLevel,
            null as parentTableName,
            -1 as [level],
            null as [type],
            TableName as Tablename,
            TableName+'Id' as columnName, 
            null as [value],
            0 as columnsequence
        FROM 
            (SELECT DISTINCT tablename FROM jsonRoot) j
    ), FKRows AS (
        SELECT 
            DISTINCT -1 as parentLevel,
            null as parentTableName,
            -1 as [level],
            null as [type],
            TableName as Tablename,
            parentTableName+'Id' as columnName, 
            null as [value],
            0 as columnsequence
        FROM 
            (SELECT DISTINCT tableName,parentTableName FROM jsonRoot) j
        WHERE 
            parentTableName is not null
    )
    SELECT 
        *,
        CASE [type]
            WHEN 1 THEN 
                CASE WHEN TRY_CONVERT(datetime2, [value], 127) IS NULL THEN 'nvarchar' ELSE 'datetime2' END
            WHEN 2 THEN 
                CASE WHEN TRY_CONVERT(int, [value]) IS NULL THEN 'float' ELSE 'int' END
            WHEN 3 THEN 
                'bit'
            END COLLATE Latin1_General_BIN2 AS DataType,
        CASE [type]
            WHEN 1 THEN 
                CASE WHEN TRY_CONVERT(datetime2, [value], 127) IS NULL THEN MAX(LEN([value])) OVER (PARTITION BY TableName, ColumnName) + @DefaultStringPadding ELSE NULL END
            WHEN 2 THEN 
                NULL
            WHEN 3 THEN 
                NULL
            END AS DataTypePrecision
    INTO ##parsedJson
    FROM jsonRoot
    WHERE 
        [type] in (1,2,3)
    UNION ALL SELECT IdRows.parentLevel, IdRows.parentTableName, IdRows.[level], IdRows.[type], IdRows.TableName, IdRows.ColumnName, IdRows.[value], -10 AS ColumnSequence, 'int IDENTITY(1,1) PRIMARY KEY' as datatype, null as datatypeprecision FROM IdRows 
    UNION ALL SELECT FKRows.parentLevel, FKRows.parentTableName, FKRows.[level], FKRows.[type], FKRows.TableName, FKRows.ColumnName, FKRows.[value], -9 AS ColumnSequence, 'int' as datatype, null as datatypeprecision FROM FKRows 
    
    -- For debugging:
    -- SELECT * FROM ##parsedJson ORDER BY ParentLevel, level, tablename, columnsequence
    
    DECLARE @CreateStatements nvarchar(max);
    
    SELECT
        @CreateStatements = COALESCE(@CreateStatements + CHAR(13) + CHAR(13), '') + 
        'CREATE TABLE ' + @Schema + '.' + TableName + CHAR(13) + '(' + CHAR(13) +
            STRING_AGG( ColumnName + ' ' + DataType + ISNULL('('+CAST(DataTypePrecision AS nvarchar(20))+')','') +  CASE WHEN DataType like '%PRIMARY KEY%' THEN '' ELSE ' NULL' END, ','+CHAR(13)) WITHIN GROUP (ORDER BY ColumnSequence) 
        + CHAR(13)+')'
    FROM
        (SELECT DISTINCT 
            j.TableName, 
            j.ColumnName,
            MAX(j.ColumnSequence) AS ColumnSequence, 
            j.DataType, 
            j.DataTypePrecision, 
            j.[level] 
        FROM 
            ##parsedJson j
            CROSS APPLY (SELECT TOP 1 ParentTableName + 'Id' AS ColumnName FROM ##parsedJson p WHERE j.TableName = p.TableName ) p
        GROUP BY
            j.TableName, j.ColumnName,p.ColumnName, j.DataType, j.DataTypePrecision, j.[level] 
        ) j
    GROUP BY
        TableName
    
    
    PRINT @CreateStatements;

You can find the full solution at https://bertwagner.com/posts/converting-json-to-sql-server-create-table-statements/

Also, the JSON can be converted to a POJO class in Java using jsonschema2pojo:

package com.cooltrickshome;

import java.io.File;
import java.io.IOException;
import java.net.URL;

import org.jsonschema2pojo.DefaultGenerationConfig;
import org.jsonschema2pojo.GenerationConfig;
import org.jsonschema2pojo.Jackson2Annotator;
import org.jsonschema2pojo.SchemaGenerator;
import org.jsonschema2pojo.SchemaMapper;
import org.jsonschema2pojo.SchemaStore;
import org.jsonschema2pojo.SourceType;
import org.jsonschema2pojo.rules.RuleFactory;

import com.sun.codemodel.JCodeModel;

public class JsonToPojo {

    public static void main(String[] args) {
        String packageName = "com.cooltrickshome";
        File inputJson = new File("." + File.separator + "input.json");
        File outputPojoDirectory = new File("." + File.separator + "convertedPojo");
        outputPojoDirectory.mkdirs();
        try {
            new JsonToPojo().convert2JSON(inputJson.toURI().toURL(), outputPojoDirectory,
                    packageName, inputJson.getName().replace(".json", ""));
        } catch (IOException e) {
            System.out.println("Encountered issue while converting to pojo: " + e.getMessage());
            e.printStackTrace();
        }
    }

    public void convert2JSON(URL inputJson, File outputPojoDirectory, String packageName,
                             String className) throws IOException {
        JCodeModel codeModel = new JCodeModel();
        URL source = inputJson;
        GenerationConfig config = new DefaultGenerationConfig() {
            // Set config options by overriding methods.
            @Override
            public boolean isGenerateBuilders() {
                return true;
            }

            @Override
            public SourceType getSourceType() {
                return SourceType.JSON;
            }
        };
        SchemaMapper mapper = new SchemaMapper(
                new RuleFactory(config, new Jackson2Annotator(config), new SchemaStore()),
                new SchemaGenerator());
        mapper.generate(codeModel, className, packageName, source);
        codeModel.build(outputPojoDirectory);
    }
}

Upvotes: 0

Pascal Zoleko

Reputation: 699

Use https://github.com/zolekode/json-to-tables/.

Here you go:

import json
from core.extent_table import ExtentTable
from core.table_maker import TableMaker

menu = {
  "id": "file",
  "value": "File",
  "popup": {
    "menuitem": [
      {"value": "New", "onclick": "CreateNewDoc()"},
      {"value": "Open", "onclick": "OpenDoc()"},
      {"value": "Close", "onclick": "CloseDoc()"}
    ]
  }
}

# Round-trip through a JSON string, as the data would arrive from the request
menu = json.dumps(menu)
menu = json.loads(menu)

extent_table = ExtentTable()
table_maker = TableMaker(extent_table)
table_maker.convert_json_object_to_table(menu, "menu")
table_maker.show_tables(8)
table_maker.save_tables("menu", export_as="sql", sql_connection="your_connection")

Output:

SHOWING TABLES :D

menu
   ID    id value popup
0   0  file  File     0
1   1  None  None  None
____________________________________________________


popup
   ID
0   0
1   1
____________________________________________________


popup_?_menuitem
  ID PARENT_ID is_scalar scalar
0  0         0     False   None
1  1         0     False   None
2  2         0     False   None
____________________________________________________


popup_?_menuitem_$_onclick
   ID  value         onclick PARENT_ID
0   0    New  CreateNewDoc()         0
1   1   Open       OpenDoc()         1
2   2  Close      CloseDoc()         2
3   3   None            None      None
____________________________________________________

Upvotes: 0

Peter Lawrey

Reputation: 533520

MongoDB may be your best choice, or you could have a large TEXT field and only extract the columns you are likely to search on.

However, you can CREATE TABLE for additional normalised data and ALTER TABLE to add a column. The latter can be particularly expensive.
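
If you do go down the ALTER TABLE route from Java (as the question proposes), a minimal JDBC sketch might look like the following; the table and column names are placeholders, not something taken from the question:

import java.sql.Connection;
import java.sql.SQLException;
import java.sql.Statement;

public class SchemaUpdater {

    // Adds a nullable VARCHAR2 column when a new JSON node appears.
    // NOTE: identifiers cannot be bound as parameters, so table/column names
    // must be validated against a whitelist before being concatenated into DDL.
    public void addColumn(Connection conn, String table, String column) throws SQLException {
        String sql = "ALTER TABLE " + table + " ADD (" + column + " VARCHAR2(4000))";
        try (Statement stmt = conn.createStatement()) {
            stmt.execute(sql); // DDL commits implicitly in Oracle
        }
    }
}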

Upvotes: 0

lagunex

Reputation: 156

I suggested in the comments switching to a NoSQL database like MongoDB. However, if you still feel you need a relational database, the EAV (Entity-Attribute-Value) model may point you in the right direction.

In summary, you would have a "helper" table that stores which columns an entity has and their types.

You cannot modify a Java class at runtime, but you can model the entity as a Map and implement the logic to add the desired columns.

Magento, a PHP product, uses EAV in its database.
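
To make the EAV idea concrete, here is a minimal JDBC sketch; the entity_attribute table and its columns are illustrative assumptions, not part of any particular product's schema:

import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;
import java.util.Map;

public class EavWriter {

    // Assumed helper table (illustrative only):
    //   CREATE TABLE entity_attribute (
    //       entity_id  NUMBER,
    //       attr_name  VARCHAR2(100),
    //       attr_type  VARCHAR2(30),
    //       attr_value VARCHAR2(4000)
    //   );
    private static final String INSERT_SQL =
            "INSERT INTO entity_attribute (entity_id, attr_name, attr_type, attr_value) "
                    + "VALUES (?, ?, ?, ?)";

    // Persists one JSON object, already parsed into a Map, as one row per attribute.
    public void save(Connection conn, long entityId, Map<String, Object> attributes)
            throws SQLException {
        try (PreparedStatement ps = conn.prepareStatement(INSERT_SQL)) {
            for (Map.Entry<String, Object> e : attributes.entrySet()) {
                Object value = e.getValue();
                ps.setLong(1, entityId);
                ps.setString(2, e.getKey());
                ps.setString(3, value == null ? "null" : value.getClass().getSimpleName());
                ps.setString(4, value == null ? null : value.toString());
                ps.addBatch();
            }
            ps.executeBatch();
        }
    }
}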

Upvotes: 0

scrayne

Reputation: 718

I would suggest that you avoid creating new columns, and instead create a new table that will contain one entry for each of what would have been the new columns. I'm assuming here that the new columns would be menu items. So you would have a "menu" table with these columns:

id    value

and you would have a "menuitem" table which would contain one entry for each of your menu items:

id    value    onclick

So instead of adding columns dynamically, you would be adding records.
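
As a rough sketch of that rows-instead-of-columns approach in Java (the table definitions and DAO below are assumptions made for the question's JSON, not something this answer prescribes):

import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;

public class MenuItemDao {

    // Assumed tables (illustrative only):
    //   CREATE TABLE menu     (id VARCHAR2(50) PRIMARY KEY, value VARCHAR2(100));
    //   CREATE TABLE menuitem (id NUMBER PRIMARY KEY, menu_id VARCHAR2(50),
    //                          value VARCHAR2(100), onclick VARCHAR2(200));

    // Each new menu item in the JSON becomes a new row, not a new column.
    public void saveMenuItem(Connection conn, long itemId, String menuId,
                             String value, String onclick) throws SQLException {
        String sql = "INSERT INTO menuitem (id, menu_id, value, onclick) VALUES (?, ?, ?, ?)";
        try (PreparedStatement ps = conn.prepareStatement(sql)) {
            ps.setLong(1, itemId);
            ps.setString(2, menuId);
            ps.setString(3, value);
            ps.setString(4, onclick);
            ps.executeUpdate();
        }
    }
}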

Upvotes: 1
