I have configured Filebeat so that it reads new logs from the path to the syslog file (currently set in filebeat.yml) and forwards them to Logstash, which should then parse the data into Elasticsearch.
I cannot see the parsed grok fields (e.g. syslog_timestamp, syslog_hostname, syslog_pid) anywhere in the Kibana events, and I don't know what the reason could be that the data is not being parsed.
Filebeat input file
Elasticsearch
Grok filter (in Logstash)
Kibana (Elasticsearch JSON)
input {
  beats {
    port => "5044"
  }
}
filter {
  if [type] == "syslog" {
    grok {
      match => { "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?: %{GREEDYDATA:syslog_message}" }
    }
    date {
      match => ["syslog_timestamp", "MMM d HH:mm:ss", "MMM dd HH:mm:ss"]
    }
  }
}
output {
  elasticsearch {
    hosts => ["10.107.50.205:9200"]
    index => "%{[@metadata][beat]}-%{+YYYY.MM.dd}"
  }
}
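The grok pattern itself can be sanity-checked outside Logstash. The sketch below is a hand-translated Python regex approximation of the pattern (the real SYSLOGTIMESTAMP/SYSLOGHOST grok definitions are more permissive than these simplified groups), matched against the start of the `message` field from the Kibana document shown further down:

```python
import re

# Rough Python equivalent of:
# %{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname}
# %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?: %{GREEDYDATA:syslog_message}
SYSLOG_RE = re.compile(
    r"(?P<syslog_timestamp>[A-Z][a-z]{2}\s+\d{1,2} \d{2}:\d{2}:\d{2}) "
    r"(?P<syslog_hostname>\S+) "
    r"(?P<syslog_program>[^\s\[:]+)(?:\[(?P<syslog_pid>\d+)\])?: "
    r"(?P<syslog_message>.*)"
)

# Beginning of the "message" field from the Kibana event (truncated here)
sample = "Sep 30 18:33:20 ut012905 metricbeat[46882]: 2019-09-30T18:33:20.254+0100 INFO [monitoring] ..."

m = SYSLOG_RE.match(sample)
print(m.group("syslog_timestamp"))  # Sep 30 18:33:20
print(m.group("syslog_hostname"))   # ut012905
print(m.group("syslog_program"))    # metricbeat
print(m.group("syslog_pid"))        # 46882
```

The pattern matches the sample line, which suggests the grok expression is not the problem; the events are most likely never entering the `if [type] == "syslog"` branch at all.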
{
  "_index": "filebeat-2019.09.30",
  "_type": "_doc",
  "_id": "kss7g20B5aLjyCF-6L2B",
  "_version": 1,
  "_score": null,
  "_source": {
    "message": "Sep 30 18:33:20 ut012905 metricbeat[46882]: 2019-09-30T18:33:20.254+0100#011INFO#011[monitoring]#011log/log.go:145#011Non-zero metrics in the last 30s#011{\"monitoring\": {\"metrics\": {\"beat\":{\"cpu\":{\"system\":{\"ticks\":770020,\"time\":{\"ms\":80}},\"total\":{\"ticks\":2091400,\"time\":{\"ms\":172},\"value\":2091400},\"user\":{\"ticks\":1321380,\"time\":{\"ms\":92}}},\"handles\":{\"limit\":{\"hard\":4096,\"soft\":1024},\"open\":5},\"info\":{\"ephemeral_id\":\"63755af9-7bad-4b09-8909-52e7018409fe\",\"uptime\":{\"ms\":369450706}},\"memstats\":{\"gc_next\":23786560,\"memory_alloc\":12161776,\"memory_total\":453661591544,\"rss\":2052096},\"runtime\":{\"goroutines\":36}},\"libbeat\":{\"config\":{\"module\":{\"running\":0}},\"pipeline\":{\"clients\":3,\"events\":{\"active\":89,\"published\":47,\"total\":47}}},\"metricbeat\":{\"system\":{\"cpu\":{\"events\":3,\"success\":3},\"filesystem\":{\"events\":3,\"success\":3},\"fsstat\":{\"events\":1,\"success\":1},\"load\":{\"events\":3,\"success\":3},\"memory\":{\"events\":3,\"success\":3},\"network\":{\"events\":6,\"success\":6},\"process\":{\"events\":22,\"success\":22},\"process_summary\":{\"events\":3,\"success\":3},\"socket_summary\":{\"events\":3,\"success\":3}}},\"system\":{\"load\":{\"1\":0.04,\"15\":0.01,\"5\":0.04,\"norm\":{\"1\":0.04,\"15\":0.01,\"5\":0.04}}}}}}",
    "host": {
      "containerized": false,
      "name": "ut012905",
      "architecture": "x86_64",
      "hostname": "ut012905",
      "id": "74e969e835cbfe982aa3ed2f5d76fdd9",
      "os": {
        "platform": "ubuntu",
        "name": "Ubuntu",
        "version": "16.04.6 LTS (Xenial Xerus)",
        "codename": "xenial",
        "family": "debian",
        "kernel": "4.4.0-161-generic"
      }
    },
    "ecs": {
      "version": "1.0.1"
    },
    "@version": "1",
    "agent": {
      "id": "afafb888-8d08-4a4b-8f4d-6c64291fb43d",
      "version": "7.3.2",
      "hostname": "ut012905",
      "type": "filebeat",
      "ephemeral_id": "57c8f630-00d5-4c88-bf2d-bb1102cd8530"
    },
    "log": {
      "offset": 3218320,
      "file": {
        "path": "/var/log/syslog"
      }
    },
    "tags": [
      "myCluster1",
      "beats_input_codec_plain_applied"
    ],
    "input": {
      "type": "log"
    },
    "fields": {
      "env": "staging"
    },
    "@timestamp": "2019-09-30T17:33:23.354Z"
  },
  "fields": {
    "@timestamp": [
      "2019-09-30T17:33:23.354Z"
    ]
  },
  "sort": [
    1569864803354
  ]
}
The document_type setting was removed from Filebeat in version 6.0. Since you are using Filebeat 7.3, that setting is ignored, and your messages do not have a type field, so the condition in your pipeline never matches. You need to add a new field in the Filebeat configuration and change your pipeline to filter on that field.
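A minimal sketch of that change, assuming logs are read from /var/log/syslog as in the question's event; the field name log_type is an arbitrary choice for this example, not a Filebeat built-in:

```yaml
# filebeat.yml (sketch): attach a custom field to every event.
# By default, custom fields land under the "fields" key of the event.
filebeat.inputs:
  - type: log
    paths:
      - /var/log/syslog
    fields:
      log_type: syslog
```

In the Logstash pipeline, the condition then becomes `if [fields][log_type] == "syslog"` instead of `if [type] == "syslog"` (or use `fields_under_root: true` in Filebeat and test `if [log_type] == "syslog"`).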