I built a log-analysis pipeline that ships data from Filebeat directly into Elasticsearch, with no Logstash in between. The problem: whenever a monitored file is modified, Filebeat re-sends the file's entire contents to ES, causing duplicate data. How can I fix this? My Filebeat config is below:
filebeat.prospectors:
- type: log
  enabled: true
  json.keys_under_root: true
  json.overwrite_keys: true
  json.message_key: log
  tail_files: true
  harvester_buffer_size: 16384
  backoff: "1s"
  document_type: "car"
  paths:
    - /usr/local/elasticsearch/logs/*.log

#============================= Filebeat modules ===============================
filebeat.config.modules:
  path: ${path.config}/modules.d/*.yml
  reload.enabled: false

#==================== Elasticsearch template setting ==========================
setup.template.pattern: "filebeat-%{[beat.version]}-*"
setup.template.name: "filebeat-%{[beat.version]}"
setup.template.settings:
  index.number_of_shards: 3

output.elasticsearch:
  hosts: ["192.168.220.111:9200"]
  index: "filebeat-car-%{+yyyy.MM.dd}"

processors:
- decode_json_fields:
    fields: ["message"]
    process_array: false
    max_depth: 1
    target: ""
    overwrite_keys: false
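A re-send of the whole file usually means Filebeat's stored offset for that file was invalidated, typically because the producer truncates and rewrites the file, or an editor such as vim replaces it with a new inode, so Filebeat treats it as a brand-new file. Beyond fixing the writer to append, one mitigation is to give each event a stable Elasticsearch document `_id`, so a re-read overwrites the existing document instead of creating a duplicate. A minimal sketch, assuming Filebeat 7.6 or later (the `fingerprint` processor does not exist in the older 6.x release that `filebeat.prospectors` and `document_type` suggest this config targets):

```yaml
# Sketch only: requires Filebeat >= 7.6 for the fingerprint processor.
processors:
  - fingerprint:
      fields: ["message"]            # hash the raw log line
      target_field: "@metadata._id"  # ES indexes repeats as updates, not new docs
```

Note the trade-off: if two genuinely distinct events carry byte-identical `message` contents, they will collapse into one document, so include a timestamp or sequence field in `fields` if the logs contain one.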
Hoping someone knowledgeable can help!
Hey, did you ever solve this?
I'm hitting the same thing: when the log file is modified, Filebeat sends all of its records to Logstash instead of just the newest line. Looking for a solution~