Reputation: 21
I am loading JSON data into a Cassandra table through a Python script. But a few JSON files have more columns than usual. Currently I have created the table with 100 columns and am able to insert everything. But there is a chance that some JSON files will have more than 100 columns. How do I handle this? Is there any way to create dynamic columns if the JSON has more columns than the table?
Upvotes: 1
Views: 932
Reputation: 87164
If you're using CQL, then you need to define all columns before inserting the data. Theoretically you can use ALTER TABLE ... ADD
to add new columns, but doing this programmatically is usually not recommended, as it may cause schema disagreement and other problems.
You can work around this problem as follows:
add one map column per value type, with the map key holding the name of the "extra" column and the map value holding its actual value of the corresponding type (int, text, etc.). For example:create table test (
pk1 ..,
pk2 ..,
pkN ..,
imap frozen<map<text, int>>,
tmap frozen<map<text, text>>,
...
primary key(pk1, pk2, ...)
);
and then in your code, separate the extra columns by type and insert each group into the corresponding map.
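As a minimal sketch of that last step, the Python below splits a JSON document's unknown keys into an int-valued map and a text-valued map matching the `imap`/`tmap` columns above. The function name, the `known_keys` set, and the commented-out driver call are illustrative assumptions, not part of the original answer.

```python
def split_extras(doc, known_keys):
    """Separate unknown JSON keys into int-valued and text-valued maps."""
    imap, tmap = {}, {}
    for key, value in doc.items():
        if key in known_keys:
            continue                 # already a real table column
        if isinstance(value, bool):  # bool is a subclass of int in Python,
            tmap[key] = str(value)   # but not a Cassandra int, so store as text
        elif isinstance(value, int):
            imap[key] = value
        else:
            tmap[key] = str(value)   # fall back to text for everything else
    return imap, tmap

doc = {"pk1": "a", "pk2": "b", "count": 10, "note": "hi", "flag": True}
imap, tmap = split_extras(doc, known_keys={"pk1", "pk2"})
# imap == {"count": 10}; tmap == {"note": "hi", "flag": "True"}

# With the DataStax Python driver, the insert might then look like
# (column and table names assumed from the schema sketch above):
# session.execute(
#     "INSERT INTO test (pk1, pk2, imap, tmap) VALUES (%s, %s, %s, %s)",
#     (doc["pk1"], doc["pk2"], imap, tmap),
# )
```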
Upvotes: 1