Fluentd fails to write logs to Elasticsearch


Using:

  • fluentd 1.11.2
  • fluent-plugin-elasticsearch 4.1.3
  • Elasticsearch 7.5.1
  • Spring Boot 2.3.3

Running in OpenShift (Kubernetes v1.17.1+20ba474).

Fluentd and Elasticsearch run in separate pods.

Fluentd configuration file:

<source>
  @type forward
  port 24224
  bind 0.0.0.0
</source>
<filter *.**>
      @type parser
      key_name log
      reserve_data true
      <parse>
        @type none
      </parse>
</filter>
<match *.**>
  @type copy
  <store>
    @type elasticsearch
    host elasticdb
    port 9200
    logstash_format true
    logstash_prefix applogs
    logstash_dateformat %Y%m%d
    include_tag_key true
    type_name app_log
    tag_key @log_name
    flush_interval 1s
    user elastic
    password changeme
  </store>
  <store>
    @type stdout
  </store>
</match>

From a local Spring Boot service, I am sending some dummy data to fluentd:

// Local port 24224 is being forwarded to remote 24224 via oc port-forward command
private static FluentLogger LOG = FluentLogger.getLogger("app", "127.0.0.1", 24224);

Map<String, Object> data = new HashMap<String, Object>();
data.put("from", "userA");
data.put("to", "userB");

LOG.log("app", data);

which sends this JSON data:

{"from":"userA","to":"userB"}

Oddly, it only works about one time out of ten. Or it seems to work two or three times and then breaks until I change the index. In fact, the pattern of behavior is not clear.

When it does not work (most of the time), these are the logs in the fluentd pod:

2020-09-18 17:33:08.000000000 +0000 app.appaa: {"from":"userA","to":"userB"}
2020-09-18 17:33:37 +0000 [warn]: #0 dump an error event: error_class=ArgumentError error="log does not exist" location=nil tag="fluent.warn" time=2020-09-18 17:33:37.328180192 +0000 record={"error"=>"#<ArgumentError: log does not exist>", "location"=>nil, "tag"=>"app.appaa", "time"=>1600450388, "record"=>{"from"=>"userA", "to"=>"userB"}, "message"=>"dump an error event: error_class=ArgumentError error=\"log does not exist\" location=nil tag=\"app.appaa\" time=1600450388 record={\"from\"=>\"userAa\", \"to\"=>\"userBb\"}"}
2020-09-18 17:33:37.328180192 +0000 fluent.warn: {"error":"#<ArgumentError: log does not exist>","location":null,"tag":"app.appaa","time":1600450388,"record":{"from":"userA","to":"userB"},"message":"dump an error event: error_class=ArgumentError error=\"log does not exist\" location=nil tag=\"app.appaa\" time=1600450388 record={\"from\"=>\"userA\", \"to\"=>\"userB\"}"}
warning: 299 Elasticsearch-7.5.1-3ae9ac9a93c95bd0cdc054951cf95d88e1e18d96 "[types removal] Specifying types in bulk requests is deprecated."

The Elasticsearch pod itself does not show anything (a logging-level issue, I guess), but if I look in Elasticsearch I see this:

{
    "_index": "applogs-20200918",
    "_type": "_doc",
    "_id": "F0M2onQBB89nIri4Cb1Z",
    "_score": 1.0,
    "_source": {
        "error": "#<ArgumentError: log does not exist>",
        "location": null,
        "tag": "app.app",
        "time": 1600449251,
        "record": {
            "from": "userA",
            "to": "userB"
        },
        "message": "dump an error event: error_class=ArgumentError error=\"log does not exist\" location=nil tag=\"app.app\" time=1600449251 record={\"from\"=>\"userA\", \"to\"=>\"userB\"}",
        "@timestamp": "2020-09-18T17:14:39.775332214+00:00",
        "@log_name": "fluent.warn"
    }
}

So it looks like the error comes from:

"ArgumentError: log does not exist"

Has anyone run into this error before?

spring elasticsearch openshift fluentd
1 Answer

The parser configuration in your filter, i.e.

<filter *.**>
  @type parser
  key_name log    # << Look for key `log` in event
  # ...
</filter>

is looking for a key named `log`, which does not exist in your event:

{"from":"userA","to":"userB"}

You need to send something like this instead:

{"log":"... your log here..."}

If you use quotes inside the log value, you may need to escape the " characters.

Relevant docs: https://docs.fluentd.org/filter/parser#key_name
