Reputation: 183
I have 50 tables with data related to each other. Usually, when there are only 2-3 tables, we create a stored procedure (SP) to join them and use the output to build a single Elasticsearch index, which we then visualize in Kibana.
But with 50 tables, the SP output would have hundreds of columns (and each table has 10,000-1,000,000 rows), so an SP doesn't seem like a good approach.
I tried creating 50 indices with similar names and then using an index pattern. But when creating a dashboard from the index pattern, it seems there is actually no 'join' feature there.
Here are 2 sample table structures:
Table1 has: Departmentid, department address, department name
Table2 has: employeename, Departmentid, employeeage
So I expect these 2 tables to be joined with 'Departmentid' as the key. Remember that we have 50 tables, each with lots of data.
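One common alternative to joining at query time is to denormalize before indexing: flatten each employee row together with its department's fields into a single document. A minimal sketch using the two sample tables from the question (the rows themselves are made up for illustration):

```python
# Denormalize Table1 (departments) and Table2 (employees) into flat
# documents keyed by Departmentid, so Elasticsearch never has to join.
departments = [
    {"Departmentid": 1, "departmentaddress": "1 Main St", "departmentname": "HR"},
]
employees = [
    {"employeename": "Alice", "Departmentid": 1, "employeeage": 34},
    {"employeename": "Bob", "Departmentid": 1, "employeeage": 41},
]

# Index departments by their key for O(1) lookup.
dept_by_id = {d["Departmentid"]: d for d in departments}

# One flat document per employee, with the department fields copied in.
docs = [{**dept_by_id[e["Departmentid"]], **e} for e in employees]

print(docs[0]["departmentname"])  # HR
```

The same pattern extends to more tables: pick one "driving" table for the document grain and copy the related columns in, at the cost of repeating department data on every employee document.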
Any suggestions? Ella
Upvotes: 0
Views: 764
Reputation: 1696
Elasticsearch is not your traditional RDBMS, so there is no concept of joins. There is limited join support using parent/child relationships within the same index, but that performs somewhat slower than a denormalized flat document structure or nested attributes.
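As an illustration of the nested-attributes option, a mapping could embed the employees inside each department document. A sketch (index layout and field names are assumptions based on the sample tables in the question):

```json
{
  "mappings": {
    "properties": {
      "Departmentid":      { "type": "keyword" },
      "departmentname":    { "type": "keyword" },
      "departmentaddress": { "type": "text" },
      "employees": {
        "type": "nested",
        "properties": {
          "employeename": { "type": "keyword" },
          "employeeage":  { "type": "integer" }
        }
      }
    }
  }
}
```

Nested fields keep each employee object internally consistent in queries, but every update to one employee rewrites the whole department document, which matters at the row counts mentioned above.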
You will first need to model your index structure and decide which parts of the 50 tables/columns are meaningful enough to bring into Elasticsearch. Just because it's called schemaless and you have the option to dump everything in does not necessarily mean you should load it all into your index.
So my suggestion is to spend time modeling your index structure first, decide which attributes you need now and anticipate needing in the future, and then finalize your sourcing query.
Also, depending on how frequently you need to load data into your index, you may not have to go the SP route. You can use Logstash to load data with the underlying SQL query, or write a custom data loader using the ES client libraries. Worst case, you can even use curl to call the bulk API directly with prepared JSON data files. Pick whichever is most appropriate for your use case.
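If you go the Logstash route, the JDBC input plugin can run the sourcing query on a schedule and push the rows straight into an index. A minimal sketch (connection string, credentials, table names, and the join itself are placeholders to adapt):

```conf
input {
  jdbc {
    # Placeholder connection details -- adjust for your database.
    jdbc_connection_string => "jdbc:sqlserver://localhost:1433;databaseName=hr"
    jdbc_driver_library    => "/path/to/mssql-jdbc.jar"
    jdbc_driver_class      => "com.microsoft.sqlserver.jdbc.SQLServerDriver"
    jdbc_user              => "logstash"
    jdbc_password          => "secret"
    # The join happens on the SQL side, so ES receives flat rows.
    statement => "SELECT e.employeename, e.employeeage, d.departmentname, d.Departmentid FROM Table2 e JOIN Table1 d ON e.Departmentid = d.Departmentid"
    schedule  => "*/15 * * * *"   # poll every 15 minutes
  }
}
output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "employees"
  }
}
```

This keeps the denormalization logic in one SQL statement rather than a stored procedure, and the schedule handles incremental reloads.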
Upvotes: 1