Reputation: 1
I collect all container logs using OpenSearch. Sometimes the logs carry different value types for the same field, like this:
ex1) { "name": "david", "age": "20" }
ex2) { "name": "david", "age": 20 }
I know that the data modeling or mapping design needs to be fixed, but that takes a lot of time and I don't have clearance for it :( So I want to save all of the data regardless of its type.
Is it possible to save it using the multi-field feature and an ingest pipeline (with the same field name)? Please let me know a good way to solve this. Thanks.
Upvotes: -1
Views: 147
Reputation: 1942
I guess you are looking for the convert processor:
POST _ingest/pipeline/_simulate
{
  "pipeline": {
    "description": "debug",
    "processors": [
      {
        "convert": {
          "field": "end",
          "type": "integer"
        }
      }
    ]
  },
  "docs": [
    {
      "_index": "banana",
      "_id": "125468",
      "_source": {
        "end": 2000
      }
    },
    {
      "_index": "banana",
      "_id": "125468",
      "_source": {
        "end": "2000"
      }
    }
  ]
}
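Once the simulation looks right, the pipeline can be stored and attached to the index as its default pipeline, so every indexed document passes through it automatically. A minimal sketch, where the pipeline name convert-age and index name my-logs are placeholders, and age is the field from your question:

PUT _ingest/pipeline/convert-age
{
  "description": "cast age to integer",
  "processors": [
    {
      "convert": {
        "field": "age",
        "type": "integer"
      }
    }
  ]
}

PUT my-logs/_settings
{
  "index.default_pipeline": "convert-age"
}

With default_pipeline set, both "age": "20" and "age": 20 end up indexed as an integer without the client having to specify the pipeline on each bulk request.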
The example you shared above will work without issue; with dynamic mapping the field will simply be mapped as keyword. In the example I gave above, the field will be mapped as long. But whenever you send a value that cannot be cast to an integer, you will get an error like:
unable to convert [2000-01-01] to integer
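To keep such documents indexable instead of rejecting them, the convert processor accepts an on_failure block (or ignore_failure: true). A sketch, assuming a placeholder pipeline name convert-age-safe and a hypothetical fallback field age_raw: on a failed cast, the original value is renamed out of the way so the document is still stored.

PUT _ingest/pipeline/convert-age-safe
{
  "description": "cast age to integer, preserving unparseable values",
  "processors": [
    {
      "convert": {
        "field": "age",
        "type": "integer",
        "on_failure": [
          {
            "rename": {
              "field": "age",
              "target_field": "age_raw"
            }
          }
        ]
      }
    }
  ]
}

A value like "2000-01-01" would then land in age_raw (mapped dynamically as keyword) rather than causing an indexing error on age.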
Upvotes: 0