Filtering Filebeat input with or without Logstash

Question (0 votes, 2 answers)

In our current setup we use Filebeat to ship logs to an Elasticsearch instance. The application runs in AWS and writes its logs in JSON format.

For some reason, AWS decided to prefix the log lines in a new platform version, and now the log parsing no longer works.

Apr 17 06:33:32 ip-172-31-35-113 web: {"@timestamp":"2020-04-17T06:33:32.691Z","@version":"1","message":"Tomcat started on port(s): 5000 (http) with context path ''","logger_name":"org.springframework.boot.web.embedded.tomcat.TomcatWebServer","thread_name":"main","level":"INFO","level_value":20000}

Previously it was just:

{"@timestamp":"2020-04-17T06:33:32.691Z","@version":"1","message":"Tomcat started on port(s): 5000 (http) with context path ''","logger_name":"org.springframework.boot.web.embedded.tomcat.TomcatWebServer","thread_name":"main","level":"INFO","level_value":20000}

The question is: can we avoid using Logstash to convert the log lines back to the old format? If not, how do we remove the prefix? Which filter is the best choice?

My current Filebeat configuration looks like this:

    filebeat.inputs:
      - type: log
        paths:
          - /var/log/web-1.log
        json.keys_under_root: true
        json.ignore_decoding_error: true
        json.overwrite_keys: true
        fields_under_root: true
        fields:
          environment: ${ENV_NAME:not_set}
          app: myapp

    cloud.id: "${ELASTIC_CLOUD_ID:not_set}"
    cloud.auth: "${ELASTIC_CLOUD_AUTH:not_set}"
Tags: elasticsearch, logstash, amazon-elastic-beanstalk, filebeat
2 Answers

Answer 1 (0 votes):
    processors:
      # first ignore the preamble and only keep the JSON data
      - dissect:
          tokenizer: "%{?ignore} %{+ignore} %{+ignore} %{+ignore} %{+ignore}: %{json}"
          field: "message"
          target_prefix: ""
      # then parse the JSON data
      - decode_json_fields:
          fields: ["json"]
          process_array: false
          max_depth: 1
          target: ""
          overwrite_keys: false
          add_error_key: true
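To illustrate what these two processors do, here is a minimal Python sketch (the `strip_preamble` helper is illustrative, not Filebeat code) that splits the preamble off the sample line and decodes the remaining JSON:

```python
import json

# Sample line with the AWS preamble, as in the question
line = ('Apr 17 06:33:32 ip-172-31-35-113 web: '
        '{"@timestamp":"2020-04-17T06:33:32.691Z","@version":"1",'
        '"message":"Tomcat started on port(s): 5000 (http) with context path \'\'",'
        '"logger_name":"org.springframework.boot.web.embedded.tomcat.TomcatWebServer",'
        '"thread_name":"main","level":"INFO","level_value":20000}')

def strip_preamble(message: str) -> str:
    """Rough stand-in for the dissect step: drop everything up to the
    first ': {' separator and keep only the JSON payload."""
    _, _, payload = message.partition(': {')
    return '{' + payload

# Stand-in for decode_json_fields: parse the isolated payload
event = json.loads(strip_preamble(line))
print(event['level'])  # -> INFO
```

After both steps the event carries the original JSON fields again, which is what `decode_json_fields` with `target: ""` produces at the event's top level.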

If you don't want to keep the beginning of the line, use the JSON filter in Logstash, like this:

filter {
    json {
        source => "message"
    }
}
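Note that on the new prefixed lines, parsing the raw `message` field as JSON will fail until the preamble is stripped. A short Python sketch (the regex is illustrative, not Logstash itself) shows the failure mode and a fix:

```python
import json
import re

prefixed = ('Apr 17 06:33:32 ip-172-31-35-113 web: '
            '{"@timestamp":"2020-04-17T06:33:32.691Z","level":"INFO"}')

# Parsing the whole message fails, just as a JSON filter on the
# raw field would fail on the prefixed line
try:
    json.loads(prefixed)
    parsed = False
except json.JSONDecodeError:
    parsed = True  # the prefix makes the line invalid JSON

# Stripping everything before the first '{' first makes it parse
payload = re.sub(r'^[^{]*', '', prefixed, count=1)
event = json.loads(payload)
```

So in practice the JSON filter would be combined with a step (e.g. dissect or grok) that isolates the JSON part of the line first.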

Both of these capabilities may also be available in Filebeat. But in my experience, I prefer Logstash when it comes to parsing and manipulating log data.
