I have deployed filebeat, logstash and elasticsearch on the same server. The intended flow is: filebeat reads files and ships them to logstash, and logstash forwards the events to elasticsearch. After completing the configuration, however, I cannot see any of the ingested data in Kibana or in the es-client browser extension. Before this, shipping files from filebeat directly to elasticsearch worked fine, and the data was visible in es-client. My configuration is as follows:
OS: CentOS 7
Filebeat configuration:
filebeat.inputs:
  - type: log
    enabled: true
    paths:
      - /home/kuisahn/data/*.json
    fields:
      type: test_data

output.logstash:
  enabled: true
  hosts: ["localhost:5044"]
Logstash pipeline configuration (logstash_filebeat2es.conf):
input {
  beats {
    host => "0.0.0.0"
    port => 5044
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "testdata-%{+YYYY.MM.dd}"
  }
}
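To see whether events ever reach Logstash at all, a stdout output can be added next to the elasticsearch output as a purely diagnostic sketch (rubydebug is a standard codec; remove it once the pipeline works):

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "testdata-%{+YYYY.MM.dd}"
  }
  # diagnostic only: print every incoming event to the Logstash console
  stdout { codec => rubydebug }
}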
Startup order: Elasticsearch first, then Logstash, then Filebeat.
Logstash log after startup:
Using JAVA_HOME defined java: /usr/local/jdk1.8
WARNING, using JAVA_HOME while Logstash distribution comes with a bundled JDK
Sending Logstash logs to /usr/local/logstash-7.10.0/logs which is now configured via log4j2.properties
[2023-02-08T21:01:37,316][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"7.10.0", "jruby.version"=>"jruby 9.2.13.0 (2.5.7) 2020-08-03 9a89c94bcc Java HotSpot(TM) 64-Bit Server VM 25.351-b10 on 1.8.0_351-b10 +indy +jit [linux-x86_64]"}
[2023-02-08T21:01:37,787][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2023-02-08T21:01:39,283][INFO ][org.reflections.Reflections] Reflections took 44 ms to scan 1 urls, producing 23 keys and 47 values
[2023-02-08T21:01:40,039][INFO ][logstash.outputs.elasticsearch][main] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
[2023-02-08T21:01:40,242][WARN ][logstash.outputs.elasticsearch][main] Restored connection to ES instance {:url=>"http://localhost:9200/"}
[2023-02-08T21:01:40,286][INFO ][logstash.outputs.elasticsearch][main] ES Output version determined {:es_version=>7}
[2023-02-08T21:01:40,289][WARN ][logstash.outputs.elasticsearch][main] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>7}
[2023-02-08T21:01:40,411][INFO ][logstash.outputs.elasticsearch][main] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//localhost:9200"]}
[2023-02-08T21:01:40,510][INFO ][logstash.outputs.elasticsearch][main] Using a default mapping template {:es_version=>7, :ecs_compatibility=>:disabled}
[2023-02-08T21:01:40,570][INFO ][logstash.javapipeline ][main] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>2, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>250, "pipeline.sources"=>["/usr/local/logstash-7.10.0/config/logstash_filebeat2es.conf"], :thread=>"#<Thread:0x45b54ed0 run>"}
[2023-02-08T21:01:40,613][INFO ][logstash.outputs.elasticsearch][main] Attempting to install template {:manage_template=>{"index_patterns"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s", "number_of_shards"=>1}, "mappings"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}
[2023-02-08T21:01:41,263][INFO ][logstash.javapipeline ][main] Pipeline Java execution initialization time {"seconds"=>0.69}
[2023-02-08T21:01:41,292][INFO ][logstash.inputs.beats ][main] Beats inputs: Starting input listener {:address=>"0.0.0.0:5044"}
[2023-02-08T21:01:41,321][INFO ][logstash.javapipeline ][main] Pipeline started {"pipeline.id"=>"main"}
[2023-02-08T21:01:41,413][INFO ][logstash.agent ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2023-02-08T21:01:41,516][INFO ][org.logstash.beats.Server][main][bc81c40e052975556da35718c4654e42ce5599b98bb2848b75ffd10d86fc754c] Starting server on port: 5044
[2023-02-08T21:01:41,798][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
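The last two lines show the beats listener on 5044 and the monitoring API on 9600. Assuming those defaults, the pipeline's event counters can be polled to check whether any events have flowed through:

curl -s 'localhost:9600/_node/stats/pipelines/main?pretty'
# pipelines.main.events.in / .out should increase each time
# Filebeat successfully delivers a batch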
Filebeat log after I modified /home/kuisahn/data/data2.json:
2023-02-08T22:09:46.758-0500 INFO log/harvester.go:302 Harvester started for file: /home/kuisahn/data/data2.json
2023-02-08T22:09:47.761-0500 INFO [publisher_pipeline_output] pipeline/output.go:143 Connecting to backoff(async(tcp://localhost:5044))
2023-02-08T22:09:47.761-0500 INFO [publisher] pipeline/retry.go:219 retryer: send unwait signal to consumer
2023-02-08T22:09:47.761-0500 INFO [publisher] pipeline/retry.go:223 done
2023-02-08T22:09:47.762-0500 INFO [publisher_pipeline_output] pipeline/output.go:151 Connection to backoff(async(tcp://localhost:5044)) established
2023-02-08T22:10:06.555-0500 INFO [monitoring] log/log.go:145 Non-zero metrics in the last 30s {"monitoring": {"metrics": {"beat":{"cpu":{"system":{"ticks":720,"time":{"ms":5}},"total":{"ticks":1010,"time":{"ms":11},"value":1010},"user":{"ticks":290,"time":{"ms":6}}},"handles":{"limit":{"hard":65535,"soft":65535},"open":13},"info":{"ephemeral_id":"c7792d34-2a03-4427-ac99-4d0844cf3724","uptime":{"ms":4080083}},"memstats":{"gc_next":17859600,"memory_alloc":12665080,"memory_total":96501440,"rss":491520},"runtime":{"goroutines":30}},"filebeat":{"events":{"added":9,"done":9},"harvester":{"open_files":1,"running":1,"started":1}},"libbeat":{"config":{"module":{"running":0}},"output":{"events":{"acked":7,"batches":1,"total":7},"read":{"bytes":6},"write":{"bytes":2470}},"pipeline":{"clients":1,"events":{"active":0,"filtered":2,"published":7,"retry":7,"total":9},"queue":{"acked":7}}},"registrar":{"states":{"cleanup":1,"current":6,"update":9},"writes":{"success":3,"total":3}},"system":{"load":{"1":0,"15":0.05,"5":0.01,"norm":{"1":0,"15":0.025,"5":0.005}}}}}}
2023-02-08T22:10:36.554-0500 INFO [monitoring] log/log.go:145 Non-zero metrics in the last 30s {"monitoring": {"metrics": {"beat":{"cpu":{"system":{"ticks":720,"time":{"ms":3}},"total":{"ticks":1010,"time":{"ms":8},"value":1010},"user":{"ticks":290,"time":{"ms":5}}},"handles":{"limit":{"hard":65535,"soft":65535},"open":13},"info":{"ephemeral_id":"c7792d34-2a03-4427-ac99-4d0844cf3724","uptime":{"ms":4110083}},"memstats":{"gc_next":18099664,"memory_alloc":9086504,"memory_total":96676688,"rss":229376},"runtime":{"goroutines":30}},"filebeat":{"harvester":{"open_files":1,"running":1}},"libbeat":{"config":{"module":{"running":0}},"pipeline":{"clients":1,"events":{"active":0}}},"registrar":{"states":{"current":6}},"system":{"load":{"1":0,"15":0.05,"5":0.01,"norm":{"1":0,"15":0.025,"5":0.005}}}}}}
After modifying the source file, the Filebeat log changed as shown above, but the Logstash log did not change at all.
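It is also worth confirming on the host that the listener is really up and that Filebeat holds an established connection to it (ss ships with CentOS 7; the grep pattern is just a convenience):

ss -lntp | grep 5044    # listener: should show the Logstash java process
ss -tnp  | grep 5044    # established connection from Filebeat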
After connecting to elasticsearch, I can only see the index Kibana created for itself and the index created earlier when filebeat pushed data directly to elasticsearch; there is no data pushed over from logstash.
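For completeness, the indices can be listed directly against Elasticsearch with the standard _cat API (default port 9200) to confirm that no testdata-* index was created:

curl -s 'localhost:9200/_cat/indices?v'
# a working pipeline should produce an index named testdata-YYYY.MM.dd
# matching the ingestion date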