I have logs in the following format: plain JSON with nested fields.
{
  "level": "info",
  "message": {
    "req": {
      "headers": {
        "host": "localhost:8080",
        "connection": "keep-alive",
        "x-forwarded-for": "192.168.1.1, 1.1.1.1",
        "x-forwarded-proto": "http"
      },
      "url": "/products?userId=493d0aec-a9a7-42a3",
      "method": "GET",
      "originalUrl": "/products?userId=493d0aec-a9a7-42a3",
      "params": {
        "0": "/products"
      },
      "query": {
        "userId": "493d0aec-a9a7-42a3"
      },
      "body": ""
    },
    "res": {
      "headers": {
        "traceid": "ac586e4e924048",
        "x-correlation-id": "57d7920d-b623-48f8",
        "content-type": "application/json;charset=UTF-8",
        "content-length": "2",
        "date": "Fri, 08 Mar 2019 09:55:45 GMT",
        "connection": "close"
      },
      "statusCode": 200,
      "body": "[]"
    },
    "gateway": "internal"
  },
  "correlationId": "57d7920d-b623-48f8",
  "timestamp": "2019-03-08T09:55:45.833Z"
}
How do I parse it correctly with Filebeat and Logstash so that all JSON fields show up in Kibana as separate (parsed) fields? My problem is with the "message" field, which contains nested JSON. Events where "message" holds a plain string parse fine, but not the ones where it holds JSON.
My attempts:
1. I tried to tell Filebeat that it is JSON with the following configuration (doing nothing on the LS side):
filebeat.inputs:
- type: stdin
  json.keys_under_root: true
  json.add_error_key: true
The result looks strange to me, because in Kibana I get "message" as a single string in which every ":" has been replaced by "=>":
{
  "req" => {
    "originalUrl" => "/offers",
    "params" => {
      "0" => "/offers"
    },
    "query" => {},
    "body" => "",
    "headers" => {
      "accept-encoding" => "gzip",
      "user-agent" => "okhttp/3.8.1",
      "x-consumer-id" => "f2a6e4cd-2224-4535
All fields outside "message" are parsed correctly.
2. I did nothing on the Filebeat side and used this filter in LS:
json {
  source => "message"
  target => "message_json"
}
The logs did not show up in Kibana at all, and I got the following errors in LS:
[2019-03-08T09:55:47,084][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"filebeat-6.5.0-2019.03.08-sdx", :_type=>"doc", :routing=>nil}, #], :response=>{"index"=>{"_index"=>"filebeat-6.5.0-2019.03.08-sdx", "_type"=>"doc", "_id"=>"ERS6XGkBgE-US7A6Mvt", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse field [json.message] of type [keyword]", "caused_by"=>{"type"=>"illegal_state_exception", "reason"=>"Can't get text on a START_OBJECT at 1:461"}}}}}
[2019-03-08T09:55:47,085][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"filebeat-6.5.0-2019.03.08-sdx", :_type=>"doc", :routing=>nil}, #], :response=>{"index"=>{"_index"=>"filebeat-6.5.0-2019.03.08-sdx", "_type"=>"doc", "_id"=>"EhS6XGkBgE-US7A6Mvt", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse field [json.message] of type [keyword]", "caused_by"=>{"type"=>"illegal_state_exception", "reason"=>"Can't get text on a START_OBJECT at 1:461"}}}}}
This filter works fine for me when the "message" field is a string (not JSON).
Any idea how to parse the nested JSON in the "message" field?
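For reference, a variant of that filter which never sends an object into the existing keyword-mapped field, assuming the 400 above is caused by the old string mapping of "message". A sketch only, not tested against this setup:

filter {
  json {
    source         => "message"
    target         => "message_json"
    tag_on_failure => ["_jsonparsefailure"]  # tag instead of dropping the event
    remove_field   => ["message"]            # applied only if parsing succeeds
  }
}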
I also had problems parsing JSON through Logstash. I struggled with this for quite a while and never solved it on the Logstash side. But fortunately we have ingest nodes in Elasticsearch, so let me suggest a solution to your problem:
You create a pipeline (a very simple one):
{
  "description": "Parse JSON log",
  "processors": [
    {
      "json": {
        "field": "message",
        "target_field": "message-json"
      }
    },
    {
      "remove": {
        "field": "message"
      }
    },
    {
      "append": {
        "field": "tags",
        "value": ["isjsonlog"]
      }
    }
  ],
  "on_failure": [
    {
      "append": {
        "field": "tags",
        "value": ["nonjsonlog"]
      }
    }
  ]
}
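The pipeline definition above has to be registered under a name via the ingest API; for example (the name parse-json-log is my choice, and the definition is assumed to be saved as pipeline.json):

curl -X PUT "localhost:9200/_ingest/pipeline/parse-json-log" \
     -H 'Content-Type: application/json' \
     -d @pipeline.json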
And in your output plugin you configure:
elasticsearch {
  hosts => ["localhost"]
  index => "filebeat-%{+YYYY.MM.dd}"
  manage_template => false
  pipeline => "your_pipeline_name"
}
Then you can forget about your JSON parsing problem.
If you use Filebeat, you can also send JSON logs directly to the pipeline by configuring Filebeat:
output.elasticsearch:
  ....
  pipelines:
    - pipeline: "your pipeline name"
      when:
        contains:
          message: "{"
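Before wiring everything together, the pipeline can be dry-run against a sample document with the simulate API (Kibana Dev Tools syntax; the document body here is shortened, substitute your pipeline name):

POST _ingest/pipeline/your_pipeline_name/_simulate
{
  "docs": [
    {
      "_source": {
        "message": "{\"req\": {\"method\": \"GET\"}}"
      }
    }
  ]
}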