Reputation: 2445
What is the best way to index constantly changing data from a PostgreSQL database into Solr/Elasticsearch?
I have a Postgres database on AWS RDS and I want to perform complex searches on it. However, the data I will query against changes constantly, with a very high rate of writes and updates, so I am not sure how to transfer the data to Solr/Elasticsearch efficiently and reliably.
Thanks for the help
Upvotes: 8
Views: 9992
Reputation: 1063
At the risk of someone marking this question as a duplicate, here's the link to setting up postgres-to-elasticsearch in another StackOverflow thread. There's also this blog post on Atlassian that talks about how to get real-time updates from PostgreSQL into ElasticSearch.
The Atlassian post, for the tl;dr crowd, uses stored Postgres procedures to copy updated/inserted data to a staging table, then processes the staging table separately. It's a nice approach that would work for either ES or Solr. Unfortunately, it's a roll-your-own solution, unless you are familiar with Clojure.
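For reference, the core of that staging-table pattern can be set up with a plain Postgres trigger. This is a minimal sketch of the idea, not the Atlassian code; the table and column names (`items`, `items_staging`, `id`) are assumed for illustration:

```sql
-- Minimal sketch of the staging-table pattern; names are assumed,
-- not taken from the Atlassian post.
CREATE TABLE items_staging (
    item_id    bigint      NOT NULL,
    changed_at timestamptz NOT NULL DEFAULT now()
);

CREATE OR REPLACE FUNCTION items_to_staging() RETURNS trigger AS $$
BEGIN
    -- Record the id of every inserted/updated row; a separate worker
    -- reads this table, pushes the rows to ES/Solr, then deletes
    -- the processed entries.
    INSERT INTO items_staging (item_id) VALUES (NEW.id);
    RETURN NEW;
END;
$$ LANGUAGE plpgsql;

CREATE TRIGGER items_staging_trg
AFTER INSERT OR UPDATE ON items
FOR EACH ROW EXECUTE PROCEDURE items_to_staging();
```

A worker process can then poll `items_staging`, fetch the referenced rows, index them into the search engine, and clear what it has processed, which keeps the indexing pipeline decoupled from the write path.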
Upvotes: 9
Reputation: 24870
In the case of Solr, a general approach is to use the Data Import Handler (DIH for short). Configure the full-import and delta-import SQL properly, where the delta-import only pulls rows that have changed since the last import, judged via timestamps (so you need to design the schema with proper timestamp columns).
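A minimal data-config.xml sketch of such a setup, assuming a table `items` with an `updated_at` timestamp column (both names are mine, for illustration):

```xml
<!-- Sketch only: table/column names (items, updated_at) are assumed. -->
<dataConfig>
  <dataSource driver="org.postgresql.Driver"
              url="jdbc:postgresql://your-rds-host:5432/yourdb"
              user="solr" password="secret"/>
  <document>
    <entity name="item" pk="id"
            query="SELECT id, title, body FROM items"
            deltaQuery="SELECT id FROM items
                        WHERE updated_at &gt; '${dataimporter.last_index_time}'"
            deltaImportQuery="SELECT id, title, body FROM items
                              WHERE id = '${dataimporter.delta.id}'">
      <field column="id"    name="id"/>
      <field column="title" name="title"/>
      <field column="body"  name="body"/>
    </entity>
  </document>
</dataConfig>
```

DIH records the time of the last run in dataimport.properties and exposes it as `${dataimporter.last_index_time}`, which is what makes the timestamp-based delta work.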
The timing of the delta-import can follow two styles, which can be used separately or combined:

1. Scheduled: a cron job (or similar) periodically hits the delta-import URL, e.g. `curl 'http://<solr-host>:8983/solr/<core>/dataimport?command=delta-import'`.
2. Triggered: the application issues the same delta-import command right after it writes to the database.
Refer to https://cwiki.apache.org/confluence/display/solr/Uploading+Structured+Data+Store+Data+with+the+Data+Import+Handler for DIH details.
Upvotes: 5