Reputation: 38652
I incorrectly ingested a lot of documents into Elasticsearch using the wrong @timestamp field. I have already changed the affected Logstash pipeline to use the correct timestamps, but I cannot re-ingest the old data.
I do, however, have another document field that can be used as the timestamp (json.created_at), so I'd like to update the @timestamp field from it. I've found that I can use the _update_by_query action to do that, but I've tried several versions that didn't work, including this:
POST logstash-rails_models-*/_update_by_query
{
  "script": {
    "lang": "painless",
    "source": "ctx._source.@timestamp = ctx._source.json.created_at"
  }
}
This complains about an unexpected character:
{
"error": {
"root_cause": [
{
"type": "script_exception",
"reason": "compile error",
"script_stack": [
"ctx._source.@timestamp = ctx._source. ...",
" ^---- HERE"
],
"script": "ctx._source.@timestamp = ctx._source.json.created_at",
"lang": "painless"
}
],
"type": "script_exception",
"reason": "compile error",
"script_stack": [
"ctx._source.@timestamp = ctx._source. ...",
" ^---- HERE"
],
"script": "ctx._source.@timestamp = ctx._source.json.created_at",
"lang": "painless",
"caused_by": {
"type": "illegal_argument_exception",
"reason": "unexpected character [@].",
"caused_by": {
"type": "lexer_no_viable_alt_exception",
"reason": null
}
}
},
"status": 500
}
What should I do?
Upvotes: 0
Views: 1576
Reputation: 38652
Painless does not allow special characters such as @ in dot notation, so the correct way to access this field is with bracket notation, with the field name wrapped in quotes:
POST logstash-rails_models-*/_update_by_query
{
  "script": {
    "lang": "painless",
    "source": "ctx._source['@timestamp'] = ctx._source.json.created_at"
  }
}
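If some of the affected documents might not have the json.created_at field, it can be safer to restrict the update with an exists query and guard the script against missing values. This is only a sketch based on the index pattern and field names from the question; adjust them to your setup:
POST logstash-rails_models-*/_update_by_query
{
  "query": {
    "exists": {
      "field": "json.created_at"
    }
  },
  "script": {
    "lang": "painless",
    "source": "if (ctx._source.json != null && ctx._source.json.created_at != null) { ctx._source['@timestamp'] = ctx._source.json.created_at }"
  }
}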
See also this thread and some more info about updating fields with Painless.
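To spot-check the result afterwards, a simple search sorted by the updated field should show the copied values (again just a sketch, assuming the same index pattern):
POST logstash-rails_models-*/_search
{
  "size": 3,
  "_source": ["@timestamp", "json.created_at"],
  "sort": [
    { "@timestamp": "desc" }
  ]
}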
Upvotes: 1