Reputation: 9024
I am sending 1000 documents at a time to the Elasticsearch Bulk API.
I have a scenario where I want to update a field in a document to the product of two other fields in the same document. Here is how I am building my query via the PHP client:
{ "update" : { "_id" : "0", "_type" : "type1", "_index" : "index1"} }
{ "script" : { "inline": "ctx._source.x=ctx._source.y*ctx._source.z"} }
The PHP client sends these in batches of 1000 per bulk request.
For a total of 185,000 records it takes 30 minutes. Is there any way I can optimize this?
Upvotes: 2
Views: 1624
Reputation: 1871
If your script differs from document to document only by a factor/parameter, then you can create an appropriate stored script and use it in the partial update with a unique parameter value for each document. This way Elasticsearch won't recompile the script for every document, and the byte payload of the bulk request will be significantly smaller.
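A sketch of what that could look like (syntax as in Elasticsearch 5.6+; the script id `multiply_fields` and the `factor` parameter are illustrative names, not from the question). First, register the script once:

```json
PUT _scripts/multiply_fields
{
  "script": {
    "lang": "painless",
    "source": "ctx._source.x = ctx._source.y * ctx._source.z * params.factor"
  }
}
```

Then each bulk action references the script by id and carries only the per-document parameter, instead of repeating the full script source:

```json
{ "update" : { "_id" : "0", "_type" : "type1", "_index" : "index1" } }
{ "script" : { "id": "multiply_fields", "params": { "factor": 2 } } }
```

If, as in the question, the script is actually identical for every document, you can drop `params` entirely and just reference `{ "script": { "id": "multiply_fields" } }` — the compile-once benefit still applies.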
Upvotes: 2