user3609179

Reputation: 301

Creating dynamoDB table from JSON file

I have a massive JSON file with many items. I am trying to create a DynamoDB table based on the JSON file without having to enter each individual attribute. I have tried the following with the AWS CLI:

aws dynamodb create-table --cli-input-json file://tabledefinition.json
aws dynamodb create-table --generate-cli-skeleton

as mentioned here ( Create a DynamoDB table from json ). I have also looked at a couple of Python libraries (e.g. https://github.com/jlafon/PynamoDB), but they do not have this option. All the JSON files are stored in an S3 bucket.

Upvotes: 2

Views: 4160

Answers (1)

notionquest

Reputation: 39226

Note:

This is a very generic example, as the OP hasn't mentioned a specific table definition.

The table-definition JSON is not the same as your normal data JSON. See the example table definition below; you need to create a similar one in order to create the table using JSON and the AWS CLI.

{
    "TableName": "MusicCollection2",
    "KeySchema": [
      { "AttributeName": "Artist", "KeyType": "HASH" },
      { "AttributeName": "SongTitle", "KeyType": "RANGE" }
    ],
    "AttributeDefinitions": [
      { "AttributeName": "Artist", "AttributeType": "S" },
      { "AttributeName": "SongTitle", "AttributeType": "S" }
    ],
    "ProvisionedThroughput": {
      "ReadCapacityUnits": 5,
      "WriteCapacityUnits": 5
    }
}
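If you prefer to do the same from Python instead of the CLI, the top-level keys of this definition file (TableName, KeySchema, AttributeDefinitions, ProvisionedThroughput) match the parameters of boto3's create_table call, so the parsed JSON can be passed through as-is. A minimal sketch (the file path and the validation helper are my own, not from the OP):

```python
import json


def read_definition(path):
    """Load a cli-input-json style table definition and sanity-check
    that the keys boto3's create_table expects are present."""
    with open(path) as f:
        definition = json.load(f)
    required = {"TableName", "KeySchema", "AttributeDefinitions"}
    missing = required - definition.keys()
    if missing:
        raise ValueError(f"definition missing {sorted(missing)}")
    return definition


def create_table(definition):
    """Pass the definition dict straight through to boto3; its
    top-level keys match create_table's keyword parameters."""
    import boto3  # AWS SDK for Python; imported lazily so the rest is testable offline
    client = boto3.client("dynamodb")
    client.create_table(**definition)
    # Block until the table is ACTIVE before loading any data
    client.get_waiter("table_exists").wait(TableName=definition["TableName"])
```

For example, `create_table(read_definition("tabledefinition.json"))` would create the MusicCollection2 table above.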

Data Load Options:-

Option 1:-

Once the table is created, you can write Python code to load the data into DynamoDB. Please note that you need to map each attribute in the JSON to an attribute on the DynamoDB table, or store the whole JSON document as a MAP attribute. Which approach to take depends on your use case, i.e. how you are going to query the data once it is loaded.
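A sketch of that load with boto3, assuming the example schema above (Artist and SongTitle as keys) and keeping any remaining attributes under a single MAP attribute; the attribute name `data` and the file path are illustrative choices, not anything the OP specified:

```python
import json

KEYS = ("Artist", "SongTitle")  # key schema from the example table


def to_item(record):
    """Shape one raw JSON record into a DynamoDB item: the key
    attributes stay top-level, everything else is nested under a
    single 'data' MAP attribute."""
    item = {k: record[k] for k in KEYS}
    extra = {k: v for k, v in record.items() if k not in KEYS}
    if extra:
        item["data"] = extra
    return item


def load_file(path, table_name="MusicCollection2"):
    """Read a JSON array of records and batch-write it to DynamoDB."""
    import boto3  # AWS SDK; imported lazily so to_item stays testable offline
    table = boto3.resource("dynamodb").Table(table_name)
    with open(path) as f:
        records = json.load(f)
    # batch_writer buffers puts into batches of up to 25 items and
    # automatically retries any unprocessed items
    with table.batch_writer() as batch:
        for record in records:
            batch.put_item(Item=to_item(record))
```

Mapping each JSON attribute to its own top-level DynamoDB attribute instead would just mean returning `record` unchanged from `to_item`.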

Option 2:-

You can use AWS Data Pipeline to create the mapping and load the data into the DynamoDB table. If it is a one-time load, you can delete the Data Pipeline once the load is complete.

Upvotes: 3
