Grok syntax that differs depending on whether a line is a new entry or a continuation of the previous one


I use Grok as the default plugin to filter my logs. Say I have these 3 simple log entries:

2023-08-17 10:10:50.751 +02:00 [WARNING] [Provider] Failed to collect
2023-08-17 10:10:50.751 +02:00 [Error] [Provider] Failed to collect
AdapterException: Connection from Adapter to turbine could not be established
   at IsReadyAsync(CancellationToken token) in C:\server\Connection.cs:line 403
   at AlarmsAsync(CancellationToken token) in C:\server\Connection.cs:line 242
   at AlarmsAsync(CancellationToken token) in C:\server\Connection.cs:line 256
   at EventsAsync(Unit unit, LiveEventSubscriptionData eventData, CancellationToken token) in C:\server\Events.cs:line 55
2023-08-17 10:10:50.751 +02:00 [WARNING] [Provider] Failed to collect

So, to support reading multi-line exceptions, I created this Grok expression:

filter {
      grok {
        match => { "message" => "%{TIMESTAMP_ISO8601:timestamp} %{ISO8601_TIMEZONE:timezone} \[%{WORD:level}\] \[%{GREEDYDATA:source}\] %{GREEDYDATA:message}(?<message>(.|\r|\n)*)" }
      }
}

But here is where the problem comes in: this is how it reads the 3 entries:

{
  "source": "Provider",
  "message": [
    "Failed to collect",
    "\n2023-08-17 10:10:50.751 +02:00 [Error] [Provider] Failed to collect\nAdapterException: Connection from Adapter to turbine could not be established\n   at IsReadyAsync(CancellationToken token) in C:\\server\\Connection.cs:line 403\n   at AlarmsAsync(CancellationToken token) in C:\\server\\Connection.cs:line 242\n   at AlarmsAsync(CancellationToken token) in C:\\server\\Connection.cs:line 256\n   at EventsAsync(Unit unit, LiveEventSubscriptionData eventData, CancellationToken token) in C:\\server\\Events.cs:line 55\n2023-08-17 10:10:50.751 +02:00 [WARNING] [Provider] Failed to collect"
  ],
  "level": "WARNING",
  "timezone": "+02:00",
  "timestamp": "2023-08-17 10:10:50.751"
}

I also tried adding a multiline codec, but with no success:

input {
 file {
   mode => "tail"
   path => "/usr/share/logstash/ingest_data/*"
   codec => multiline {
       pattern => "%{TIMESTAMP_ISO8601}"
       negate => true
       what => "previous"
   }
 }
}

So, since I'm stuck here: is there a way to tell the expression which lines are a continuation and which one is a new log entry?

elasticsearch logstash elastic-stack logstash-grok
1 Answer

So I found out that to support multi-line entries in Logstash, you need a multiline codec in the file input configuration, like this (note that the pattern is anchored with ^, so only lines that actually start with a timestamp open a new event):

codec => multiline {
   pattern => "^%{TIMESTAMP_ISO8601}"   # pattern defining how a new entry starts
   negate => true
   what => "previous"
}
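
For completeness, here is a minimal sketch of the whole file input with the codec in place, reusing the path and tail mode from the question (only the codec pattern changes):

input {
  file {
    mode => "tail"
    path => "/usr/share/logstash/ingest_data/*"
    codec => multiline {
      # lines that do NOT start with a timestamp are appended to the previous event
      pattern => "^%{TIMESTAMP_ISO8601}"
      negate => true
      what => "previous"
    }
  }
}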

And the filter then looks like this:

%{TIMESTAMP_ISO8601:event_time} %{ISO8601_TIMEZONE:timezone} \[%{WORD:level}\] \[%{GREEDYDATA:source}\] %{GREEDYDATA:message}
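
Wrapped in a filter block, a minimal sketch could look like the following. The overwrite option is my own addition, not part of the original answer: it makes the parsed text replace the original message field instead of being appended to it as an array, which is what the question's output shows.

filter {
  grok {
    match => { "message" => "%{TIMESTAMP_ISO8601:event_time} %{ISO8601_TIMEZONE:timezone} \[%{WORD:level}\] \[%{GREEDYDATA:source}\] %{GREEDYDATA:message}" }
    # optional: keep message as a single string rather than an array
    overwrite => ["message"]
  }
}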