Serilog logs collected by Fluent Bit and shipped to Elasticsearch on Kubernetes are not parsed as JSON

Question · 1 vote · 1 answer

I'm running the EFK stack on Kubernetes (Minikube). An ASP.NET Core application uses Serilog to write its logs to the console as JSON. The logs DO reach Elasticsearch, but they arrive as unparsed strings stuffed into the "log" field, and that is the problem.

This is the console output:

{
    "@timestamp": "2019-03-22T22:08:24.6499272+01:00",
    "level": "Fatal",
    "messageTemplate": "Text: {Message}",
    "message": "Text: \"aaaa\"",
    "exception": {
        "Depth": 0,
        "ClassName": "",
        "Message": "Boom!",
        "Source": null,
        "StackTraceString": null,
        "RemoteStackTraceString": "",
        "RemoteStackIndex": -1,
        "HResult": -2146232832,
        "HelpURL": null
    },
    "fields": {
        "Message": "aaaa",
        "SourceContext": "frontend.values.web.Controllers.HomeController",
        "ActionId": "0a0967e8-be30-4658-8663-2a1fd7d9eb53",
        "ActionName": "frontend.values.web.Controllers.HomeController.WriteTrace (frontend.values.web)",
        "RequestId": "0HLLF1A02IS16:00000005",
        "RequestPath": "/Home/WriteTrace",
        "CorrelationId": null,
        "ConnectionId": "0HLLF1A02IS16",
        "ExceptionDetail": {
            "HResult": -2146232832,
            "Message": "Boom!",
            "Source": null,
            "Type": "System.ApplicationException"
        }
    }
}

This is the Serilog part of Program.cs (ExceptionAsObjectJsonFormatter inherits from ElasticsearchJsonFormatter):

.UseSerilog((ctx, config) =>
{
    var shouldFormatElastic = ctx.Configuration.GetValue<bool>("LOG_ELASTICFORMAT", false);
    config
        .ReadFrom.Configuration(ctx.Configuration) // Read from appsettings and env, cmdline
        .Enrich.FromLogContext()
        .Enrich.WithExceptionDetails();

    var logFormatter = new ExceptionAsObjectJsonFormatter(renderMessage: true);
    var logMessageTemplate = "[{Timestamp:HH:mm:ss} {Level:u3}] {Message:lj}{NewLine}{Exception}";

    if (shouldFormatElastic)
        config.WriteTo.Console(logFormatter, standardErrorFromLevel: LogEventLevel.Error);
    else
        config.WriteTo.Console(standardErrorFromLevel: LogEventLevel.Error, outputTemplate: logMessageTemplate);

})
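
For reference, the console entry shown earlier comes from a controller action roughly like the following. This is a simplified sketch: the class, action, message template and exception match the fields in the output (SourceContext, ActionName, ExceptionDetail), while the surrounding code is assumed.

using System;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Extensions.Logging;

namespace frontend.values.web.Controllers
{
    public class HomeController : Controller
    {
        private readonly ILogger<HomeController> _logger;

        public HomeController(ILogger<HomeController> logger) => _logger = logger;

        public IActionResult WriteTrace()
        {
            try
            {
                throw new ApplicationException("Boom!");
            }
            catch (Exception ex)
            {
                // LogCritical maps to Serilog's Fatal level; the formatter renders the
                // template into "messageTemplate", the argument into "fields", and
                // Serilog.Exceptions expands the exception into the "exception" object.
                _logger.LogCritical(ex, "Text: {Message}", "aaaa");
            }
            return Ok();
        }
    }
}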

These NuGet packages are used:

  • Serilog.AspNetCore
  • Serilog.Exceptions
  • Serilog.Formatting.Elasticsearch
  • Serilog.Settings.Configuration
  • Serilog.Sinks.Console

This is what it looks like in Kibana (screenshot).

This is the fluent-bit ConfigMap:

fluent-bit-filter.conf:
[FILTER]
    Name                kubernetes
    Match               kube.*
    Kube_URL            https://kubernetes.default.svc:443
    Kube_CA_File        /var/run/secrets/kubernetes.io/serviceaccount/ca.crt
    Kube_Token_File     /var/run/secrets/kubernetes.io/serviceaccount/token
    Merge_Log           On
    K8S-Logging.Parser  On
    K8S-Logging.Exclude On

fluent-bit-input.conf:
[INPUT]
    Name             tail
    Path             /var/log/containers/*.log
    Parser           docker
    Tag              kube.*
    Refresh_Interval 5
    Mem_Buf_Limit    5MB
    Skip_Long_Lines  On

fluent-bit-output.conf:
[OUTPUT]
    Name  es
    Match *
    Host  elasticsearch
    Port  9200
    Logstash_Format On
    Retry_Limit False
    Type  flb_type
    Time_Key @timestamp
    Replace_Dots On
    Logstash_Prefix kubernetes_cluster

fluent-bit-service.conf:
[SERVICE]
    Flush        1
    Daemon       Off
    Log_Level    info
    Parsers_File parsers.conf

fluent-bit.conf:
@INCLUDE fluent-bit-service.conf
@INCLUDE fluent-bit-input.conf
@INCLUDE fluent-bit-filter.conf
@INCLUDE fluent-bit-output.conf

parsers.conf:

But I also tried a modified version of https://raw.githubusercontent.com/fluent/fluent-bit-kubernetes-logging/master/output/elasticsearch/fluent-bit-configmap.yaml.

I installed fluent-bit with Helm using helm install stable/fluent-bit --name=fluent-bit --namespace=logging --set backend.type=es --set backend.es.host=elasticsearch --set on_minikube=true.

I also get a lot of errors like the following (as seen in Kibana):

log:{"took":0,"errors":true,"items":[{"index":{"_index":"kubernetes_cluster-2019.03.22","_type":"flb_type","_id":"YWCOp2kB4wEngjaDvxNB","status":400,"error":{"type":"mapper_parsing_exception","reason":"failed to parse","caused_by":{"type":"json_parse_exception","reason":"Duplicate field '@timestamp' at [Source: org.elasticsearch.common.bytes.BytesReference$MarkSupportingStreamInputWrapper@432f75a7; line: 1, column: 1248]"}}}}]}

log:[2019/03/22 22:38:57] [error] [out_es] could not pack/validate JSON response stream:stderr


Tags: elasticsearch, asp.net-core, kubernetes, serilog, fluent-bit

1 Answer (3 votes)

The problem was the broken fluent-bit ConfigMap. The key changes, per the comments below, are merging the parsed JSON log under its own key (Merge_Log On plus Merge_JSON_Key), renaming the output Time_Key so it no longer collides with the @timestamp field Serilog already emits, and enabling Replace_Dots. This ConfigMap works:

apiVersion: v1
kind: ConfigMap
metadata:
  name: fluent-bit-config
  namespace: logging
  labels:
    k8s-app: fluent-bit
data:
  # Configuration files: server, input, filters and output
  # ======================================================
  fluent-bit.conf: |
    [SERVICE]
        Flush         1
        Log_Level     info
        Daemon        off
        Parsers_File  parsers.conf
        HTTP_Server   On
        HTTP_Listen   0.0.0.0
        HTTP_Port     2020        
    @INCLUDE input-kubernetes.conf
    @INCLUDE filter-kubernetes.conf
    @INCLUDE output-elasticsearch.conf
  input-kubernetes.conf: |
    [INPUT]
        Name              tail
        Tag               kube.*
        Path              /var/log/containers/*.log
        Parser            docker
        DB                /var/log/flb_kube.db
        Mem_Buf_Limit     5MB
        Skip_Long_Lines   On
        Refresh_Interval  10
  filter-kubernetes.conf: |
    [FILTER]
        Name                kubernetes
        Match               kube.*
        Kube_URL            https://kubernetes.default.svc:443
        # These two may fix some duplicate field exception
        Merge_Log           On
        Merge_JSON_Key      k8s
        K8S-Logging.Parser  On
        K8S-Logging.exclude True
  output-elasticsearch.conf: |
    [OUTPUT]
        Name            es
        Match           *
        Host            ${FLUENT_ELASTICSEARCH_HOST}
        Port            ${FLUENT_ELASTICSEARCH_PORT}
        Logstash_Format On
# This fixes errors where Elasticsearch complains that kubernetes.apps.name must be an object
        Replace_Dots    On 
        Retry_Limit     False
        Type            flb_type
        # This may fix some duplicate field exception
        Time_Key        @timestamp_es
        # The Index Prefix:
        Logstash_Prefix logstash_07
  parsers.conf: |
    [PARSER]
        Name   apache
        Format regex
        Regex  ^(?<host>[^ ]*) [^ ]* (?<user>[^ ]*) \[(?<time>[^\]]*)\] "(?<method>\S+)(?: +(?<path>[^\"]*?)(?: +\S*)?)?" (?<code>[^ ]*) (?<size>[^ ]*)(?: "(?<referer>[^\"]*)" "(?<agent>[^\"]*)")?$
        Time_Key time
        Time_Format %d/%b/%Y:%H:%M:%S %z
    [PARSER]
        Name   apache2
        Format regex
        Regex  ^(?<host>[^ ]*) [^ ]* (?<user>[^ ]*) \[(?<time>[^\]]*)\] "(?<method>\S+)(?: +(?<path>[^ ]*) +\S*)?" (?<code>[^ ]*) (?<size>[^ ]*)(?: "(?<referer>[^\"]*)" "(?<agent>[^\"]*)")?$
        Time_Key time
        Time_Format %d/%b/%Y:%H:%M:%S %z
    [PARSER]
        Name   apache_error
        Format regex
        Regex  ^\[[^ ]* (?<time>[^\]]*)\] \[(?<level>[^\]]*)\](?: \[pid (?<pid>[^\]]*)\])?( \[client (?<client>[^\]]*)\])? (?<message>.*)$
    [PARSER]
        Name   nginx
        Format regex
        Regex ^(?<remote>[^ ]*) (?<host>[^ ]*) (?<user>[^ ]*) \[(?<time>[^\]]*)\] "(?<method>\S+)(?: +(?<path>[^\"]*?)(?: +\S*)?)?" (?<code>[^ ]*) (?<size>[^ ]*)(?: "(?<referer>[^\"]*)" "(?<agent>[^\"]*)")?$
        Time_Key time
        Time_Format %d/%b/%Y:%H:%M:%S %z
    [PARSER]
        Name   json
        Format json
        Time_Key time
        Time_Format %d/%b/%Y:%H:%M:%S %z
    [PARSER]
        Name        docker
        Format      json
        #Time_Key    time
        Time_Key    @timestamp
        Time_Format %Y-%m-%dT%H:%M:%S.%L
        Time_Keep   Off
        # See: https://fluentbit.io/documentation/0.14/parser/decoder.html
        # Command      |  Decoder | Field | Optional Action
        # =============|==================|=================
        # Decode_Field_As   escaped    log
        # Decode_Field_As   escaped    log    do_next
        # Decode_Field_As   json       log     
    [PARSER]
        Name        syslog
        Format      regex
        Regex       ^\<(?<pri>[0-9]+)\>(?<time>[^ ]* {1,2}[^ ]* [^ ]*) (?<host>[^ ]*) (?<ident>[a-zA-Z0-9_\/\.\-]*)(?:\[(?<pid>[0-9]+)\])?(?:[^\:]*\:)? *(?<message>.*)$
        Time_Key    time
        Time_Format %b %d %H:%M:%S
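
As an aside, the commented-out decoder lines in the docker parser above come from https://fluentbit.io/documentation/0.14/parser/decoder.html and describe an alternative way to get the log field parsed: let the tail parser itself unescape the string and decode it as JSON, instead of relying only on the kubernetes filter's Merge_Log. If that route is needed, the docker parser variant would look roughly like this:

[PARSER]
    Name        docker
    Format      json
    Time_Key    @timestamp
    Time_Format %Y-%m-%dT%H:%M:%S.%L
    Time_Keep   Off
    # Unescape the string in the "log" field, then parse its contents as JSON
    Decode_Field_As   escaped    log    do_next
    Decode_Field_As   json       log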