I'm using the ELK stack 7.1.1 with X-Pack installed.
I want to send JSON logs from Filebeat to Logstash and apply a filter to them. I've tried everything but had no success — please help me fix it.
Sample JSON logs:
{"log":"2019-10-01 07:18:26:854*[DEBUG]*cluster2-nio-worker-0*Connection*userEventTriggered*Connection[cassandraclient/10.3.254.137:9042-1, inFlight=0, closed=false] was inactive for 30 seconds, sending heartbeat\n","stream":"stdout","time":"2019-10-01T07:18:26.85462769Z"}
{"log":"2019-10-01 07:18:26:854*[DEBUG]*cluster2-nio-worker-0*Connection*userEventTriggered*Connection[cassandraclient/10.3.254.137:9042-1, inFlight=0, closed=false] was inactive for 30 seconds, sending heartbeat\n","stream":"stdout","time":"2019-10-01T07:18:26.85462769Z"}
I only want to apply the filter to the value of log:
2019-10-01 07:18:26:854*[DEBUG]*cluster2-nio-worker-0*Connection*userEventTriggered*Connection[cassandraclient/10.3.254.137:9042-1, inFlight=0, closed=false] was inactive for 30 seconds, sending heartbeat\n
My Filebeat config (filebeat.yml):
filebeat.inputs:
- type: log
  enabled: true
  paths:
    - /home/Desktop/a.log
  json.keys_under_root: true
  json.message_key: log
  tags: ["kubelogs"]
output.logstash:
  hosts: ["localhost:5044"]
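For reference, with `json.keys_under_root: true` Filebeat decodes each line as JSON and lifts the keys to the top level of the event, which is why the Logstash filter below can reference a top-level `log` field. A rough sketch of that decoding in plain Python (this is an illustration, not Filebeat itself):

```python
import json

# One line of a.log, as in the sample above
line = ('{"log":"2019-10-01 07:18:26:854*[DEBUG]*cluster2-nio-worker-0*'
        'Connection*userEventTriggered*Connection[cassandraclient/10.3.254.137:9042-1,'
        ' inFlight=0, closed=false] was inactive for 30 seconds, sending heartbeat\\n",'
        '"stream":"stdout","time":"2019-10-01T07:18:26.85462769Z"}')

event = {"input": {"type": "log"}}  # Filebeat also adds its own metadata fields
event.update(json.loads(line))      # keys_under_root: JSON keys become top-level fields

print(sorted(event))  # ['input', 'log', 'stream', 'time']
```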
My logstash.conf:
input {
    beats {
        port => "5044"
    }
}
filter {
    if "kubelogs" in [tags] {
        mutate {
            rename => ["log", "message"]
        }
        date {
            match => ["time", "ISO8601"]
            remove_field => ["time"]
        }
        if [message] =~ /\d{15}/ {
            grok {
                match => ["message","%{TIMESTAMP_ISO8601:date}\*\[%{LOGLEVEL:log-level}\]\*%{DATA:thread}\*%{DATA:class}\*%{DATA:method}\*%{DATA:imei}\*%{DATA:token}\*%{GREEDYDATA:messagedata}"]
            }
        }
        else {
            grok {
                match => ["message","%{TIMESTAMP_ISO8601:date}\*\[%{LOGLEVEL:log-level}\]\*%{DATA:thread}\*%{DATA:class}\*%{DATA:method}\*%{GREEDYDATA:messagedata}"]
            }
        }
    }
}
output {
    elasticsearch {
        hosts => ["localhost:9200"]
        index => "logstash-%{+YYYY.MM.dd}"
        manage_template => false
        user => ####
        password => #####
    }
    stdout { codec => rubydebug }
}
Output: I get the following as output instead of the log message, and no message field is created.
Please help me fix this.
{
    "ecs" => {
        "version" => "1.0.0"
    },
    "input" => {
        "type" => "log"
    },
    "agent" => {
        "version" => "7.1.1",
        "ephemeral_id" => "39c0a41b-579e-47cd-b3c2-777c2db8194c",
        "type" => "filebeat",
        "hostname" => "qolsys-desktop",
        "id" => "885ffebc-9cae-4b85-abf5-3b4f3b1faa29"
    },
    "@version" => "1",
    "log" => {
        "offset" => 756,
        "file" => {
            "path" => "/home/Desktop/a.log"
        }
    },
    "time" => "2019-10-01T07:18:28.712685255Z",
    "tags" => [
        [0] "kubelogs",
        [1] "beats_input_raw_event"
    ],
    "host" => {
        "id" => "3adafe86fcc05c016e767ae85c6124f5",
        "architecture" => "x86_64",
        "os" => {
            "version" => "14.04.6 LTS, Trusty Tahr",
            "codename" => "trusty",
            "family" => "debian",
            "kernel" => "4.4.0-148-generic",
            "name" => "Ubuntu",
            "platform" => "ubuntu"
        },
        "hostname" => "qolsys-desktop",
        "containerized" => false,
        "name" => "qolsys-desktop"
    }
}
1. The rename option of the mutate filter plugin
As explained in the documentation, the rename option takes a hash. So it should look like this:
mutate {
    rename => { "log" => "message" }
}
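The effect of the hash-style rename, sketched as a plain Python dict operation (an illustration only, not Logstash itself — the value simply moves from the old key to the new one):

```python
# A simplified event with an illustrative log value
event = {"log": "2019-10-01 07:18:26:854*[DEBUG]* sample text", "stream": "stdout"}

# rename => { "log" => "message" }: move the value under the new key
event["message"] = event.pop("log")

print("log" in event, "message" in event)  # False True
```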
2. The match option of the grok filter plugin
You should follow the syntax explained in the documentation.
If you want to match a field against a single pattern, the configuration should look like this:
grok {
    match => { "message" => "%{TIMESTAMP_ISO8601:date}\*\[%{LOGLEVEL:log-level}\]\*%{DATA:thread}\*%{DATA:class}\*%{DATA:method}\*%{DATA:imei}\*%{DATA:token}\*%{GREEDYDATA:messagedata}" }
}
And if you want to match against multiple patterns, it would look like this:
grok {
    match => {
        "message" => [
            "%{TIMESTAMP_ISO8601:date}\*\[%{LOGLEVEL:log-level}\]\*%{DATA:thread}\*%{DATA:class}\*%{DATA:method}\*%{DATA:imei}\*%{DATA:token}\*%{GREEDYDATA:messagedata}",
            "PATTERN2",
            "PATTERN3"
        ]
    }
}
The syntax you used here doesn't follow any of the allowed forms.
With these changes, the message field will be created. Your grok pattern may not work right away, but IMO that is beyond the scope of this question. You can test grok patterns on this site or in the Kibana Dev Tools Grok Debugger.
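As a quick sanity check outside Logstash, the 6-field pattern from your else branch can be approximated with a plain Python regex and run against the sample line. This is only a rough stand-in — DATA and GREEDYDATA are approximated with simple character classes, and TIMESTAMP_ISO8601 with an explicit digit pattern:

```python
import json
import re

# Rough Python approximation of the 6-field grok pattern (not Logstash itself)
PATTERN = re.compile(
    r"(?P<date>\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}:\d{3})"  # ~ TIMESTAMP_ISO8601
    r"\*\[(?P<log_level>[A-Z]+)\]"                          # ~ LOGLEVEL
    r"\*(?P<thread>[^*]*)"                                  # ~ DATA:thread
    r"\*(?P<cls>[^*]*)"                                     # ~ DATA:class
    r"\*(?P<method>[^*]*)"                                  # ~ DATA:method
    r"\*(?P<messagedata>.*)"                                # ~ GREEDYDATA
)

line = ('{"log":"2019-10-01 07:18:26:854*[DEBUG]*cluster2-nio-worker-0*'
        'Connection*userEventTriggered*Connection[cassandraclient/10.3.254.137:9042-1,'
        ' inFlight=0, closed=false] was inactive for 30 seconds, sending heartbeat\\n",'
        '"stream":"stdout","time":"2019-10-01T07:18:26.85462769Z"}')

m = PATTERN.match(json.loads(line)["log"])
print(m.group("log_level"), m.group("thread"))  # DEBUG cluster2-nio-worker-0
```

If the regex fails to match here, the real grok pattern will likely fail on the same input as well, so this gives a fast feedback loop before reloading the pipeline.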
I hope this helps.