I need help with updating documents in an index. Filebeat uploads CSV files from disk into the index.
Filebeat config:

```yaml
filebeat.inputs:
  - type: log
    id: persons
    enabled: true
    paths:
      - /data1/Logs/Preprod/Persons/*.csv
    fields:
      log_type: persons

output.elasticsearch:
  # Array of hosts to connect to.
  hosts: ["localhost:9200"]
  username: "***"
  password: "***"
  indices:
    - index: "orders-from-csv"
      when.equals:
        fields.log_type: "orders-from-csv"
    - index: "persons"
      when.equals:
        fields.log_type: "persons"
  pipeline: "%{[fields.log_type]}-pipeline"
  action: "index"
```
Here are the pipeline settings:
```json
PUT _ingest/pipeline/persons-pipeline
{
  "description": "Ingest pipeline created by text structure finder",
  "processors": [
    {
      "csv": {
        "field": "message",
        "target_fields": [
          "[Name]",
          "[IIS]",
          "[BirthDay]",
          "column4",
          "[OisKey]",
          "[Qual]",
          "[Address]",
          "[Email]",
          "[Id]"
        ],
        "separator": ";",
        "ignore_missing": false
      }
    },
    {
      "set": {
        "field": "debug_original_message",
        "value": "{{message}}"
      }
    },
    {
      "convert": {
        "field": "[Id]",
        "type": "long"
      }
    },
    {
      "date": {
        "field": "column4",
        "formats": [
          "yyyy-MM-dd HH:mm:ss.SSS"
        ],
        "target_field": "[RegDate]",
        "timezone": "Europe/Moscow"
      }
    },
    {
      "date": {
        "field": "column4",
        "formats": [
          "yyyy-MM-dd HH:mm:ss.SSS"
        ],
        "target_field": "[RegDateDay]",
        "output_format": "dd-MM-yyyy"
      }
    },
    {
      "remove": {
        "field": "message"
      }
    },
    {
      "set": {
        "field": "_id",
        "value": "{{[OisKey]}}"
      }
    }
  ],
  "on_failure": [
    {
      "set": {
        "field": "error.message",
        "value": "{{ _ingest.on_failure_message }}"
      }
    }
  ]
}
```
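To check what the pipeline produces for a given line, the `_simulate` API can be used. This is just a sketch; the sample `message` value below is invented to match the column order the `csv` processor expects (Name, IIS, BirthDay, column4, OisKey, Qual, Address, Email, Id):

```json
POST _ingest/pipeline/persons-pipeline/_simulate
{
  "docs": [
    {
      "_source": {
        "message": "John Doe;IIS-1;1990-01-01;2023-05-10 12:00:00.000;OIS-100;Q1;Main St 1;john@example.com;42"
      }
    }
  ]
}
```

The response shows the document after all processors run, including the `_id` set from `[OisKey]`, which is useful for confirming the date parsing and the convert step before real data is shipped.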
How do I update an existing document? Right now, when a document arrives whose `_id` already exists in the index but some of its fields have changed, the new document is ignored.

UPDATE: I can update the document via the API manually, but I need the update to happen automatically: either by detecting the changes or by overwriting the existing document.
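For reference, this is roughly the manual update I mean; the `_id` value and the field name here are placeholders, not real data:

```json
POST persons/_update/OIS-100
{
  "doc": {
    "[Address]": "New Address 5"
  }
}
```

This works per document, but I am looking for a way to get the same effect automatically for every document Filebeat ships.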