zz10

Reputation: 71

elasticdump fails with "JavaScript heap out of memory"

I used this command:

elasticdump --input=/opt/index_5.json --output=http://esserver:9200/index_5 --limit=5000 --transform="doc._source=Object.assign({},doc)"

I get an error like the one below while importing the data:

<--- JS stacktrace --->

==== JS stack trace =========================================

Security context: 0x3b9faf49e6e9 1: stringSlice(aka stringSlice) [0x8c113e13429] [buffer.js:~589] [pc=0x3cfe067fcdcf](this=0x34873cd026f1 ,buf=0x15dd55450ef1 ,encoding=0x3b9faf4bdd31 ,start=0,end=8) 2: write [0x2bf9d6645199] [/usr/lib/node_modules/elasticdump/node_modules/jsonparse/jsonparse.js:~127] [pc=0x3cfe06d95bbd](th...

FATAL ERROR: Ineffective mark-compacts near heap limit Allocation failed - JavaScript heap out of memory 1: 0x8fa0c0 node::Abort() [node] 2: 0x8fa10c [node] Aborted
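This fatal error means the Node.js process running elasticdump exhausted its V8 heap. One common workaround (not from the question itself, so treat it as a sketch to adapt) is to raise the heap limit via the standard `NODE_OPTIONS` environment variable before rerunning the same command:

```shell
# Raise Node's old-space heap limit to 4 GB (value in MB) so the
# elasticdump process has more room when parsing large JSON batches.
export NODE_OPTIONS="--max-old-space-size=4096"

# Same import as in the question; paths and host are from the question
# and should be adjusted for your environment.
elasticdump \
  --input=/opt/index_5.json \
  --output=http://esserver:9200/index_5 \
  --limit=5000 \
  --transform="doc._source=Object.assign({},doc)"
```

`--max-old-space-size` is a documented V8/Node flag and `NODE_OPTIONS` applies it without modifying the elasticdump launcher script; whether 4096 MB is enough depends on your batch size and document sizes.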

Upvotes: 2

Views: 1161

Answers (2)

gerardinio

Reputation: 1

I think the root cause of this out-of-memory error is that you used the limit parameter with a huge value: `--limit=5000` (the default is 100). Sometimes even the default is too high; I have seen the same issue and fixed it by lowering the value, e.g. to `--limit=10`.
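Applied to the command from the question, that suggestion would look like the sketch below (smaller batches mean fewer documents buffered in memory at once, at the cost of a slower import):

```shell
# Same input/output as in the question, but with a much smaller
# batch size so each bulk request stays small in memory.
elasticdump \
  --input=/opt/index_5.json \
  --output=http://esserver:9200/index_5 \
  --limit=10 \
  --transform="doc._source=Object.assign({},doc)"
```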

Upvotes: 0

Lukas Nevosad

Reputation: 294

In my case, downgrading to elasticsearch 6.10 solved a similar memory issue. See https://github.com/taskrabbit/elasticsearch-dump/issues/628

Upvotes: 0
