Logstash is not extracting GeoIP data into Elasticsearch

Question · Votes: 0 · Answers: 1

I'm new to the ELK stack. I have Elasticsearch and Kibana deployed on K8s, and I want to monitor a standalone Nginx server running on a separate Ubuntu machine. On that Ubuntu server I installed Filebeat and Logstash, and configured Filebeat to collect the Nginx access logs and push them to Logstash. I want to enrich the logs with GeoIP data so I can build a map in Kibana. My Filebeat and Logstash configurations are below. The problem I'm facing is that no GeoIP data is being extracted and no GeoIP fields are created in ES: when I check the nginx-* data view in Discover, I can see the access logs but none of the GeoIP-related data.

filebeat.yml

filebeat.inputs:
  - type: filestream
    enabled: true
    paths:
      - /var/log/nginx/*.log
    fields:
      log_type: nginx
    fields_under_root: true
    close_inactive: 24h
 

filebeat.config.modules:
  path: ${path.config}/modules.d/*.yml
  reload.enabled: false

output.logstash:
  hosts: ["localhost:5044"]

setup.kibana:
  host: "192.168.30.14:5601"
  username: "$user"
  password: "$mypassword"

logstash.conf

input {
  beats {
    port => 5044
  }
}

filter {
  if [fileset][module] == "nginx" {
    grok {
      match => { "message" => "%{COMBINEDAPACHELOG}" }
    }
    geoip {
      source => "message"
      target => "client_location"
      add_field => {
        "[client_location][coordinates][longtitude]" => "%{[geoip][longitude]}"
        "[client_location][coordinates][latitude]" => "%{[geoip][latitude]}"
      }
    }
  }
}

output {
  elasticsearch {
    hosts => ["https://192.168.30.12:9200"]
    index => "nginx-%{+YYYY.MM.dd}"
    user => "$username"
    password => "$mypassword"
    ssl => true
    ssl_certificate_authorities => "/$HOME/ca.crt"
    ssl_certificate_verification => false
  }
}

This is the data I get in ES:

{
  "_index": "nginx-2023.09.05",
  "_id": "VteeZYoB1eIOU-vBqNoz",
  "_version": 1,
  "_score": 0,
  "_source": {
    "fileset": {
      "name": "access"
    },
    "@timestamp": "2023-09-05T13:53:29.941Z",
    "host": {
      "name": "ubuntu"
    },
    "message": "106.202.44.241 - - [05/Sep/2023:13:53:22 +0000] \"GET / HTTP/1.1\" 304 0 \"-\" \"Mozilla/5.0 (Linux; Android 10; K) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/116.0.0.0 Mobile Safari/537.36\"",
    "tags": [
      "beats_input_codec_plain_applied"
    ],
    "service": {
      "type": "nginx"
    },
    "ecs": {
      "version": "1.12.0"
    },
    "event": {
      "module": "nginx",
      "timezone": "+00:00",
      "original": "106.202.44.241 - - [05/Sep/2023:13:53:22 +0000] \"GET / HTTP/1.1\" 304 0 \"-\" \"Mozilla/5.0 (Linux; Android 10; K) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/116.0.0.0 Mobile Safari/537.36\"",
      "dataset": "nginx.access"
    },
    "log": {
      "offset": 937020,
      "file": {
        "path": "/var/log/nginx/access.log"
      }
    },
    "@version": "1",
    "agent": {
      "version": "8.9.1",
      "id": "518bc225-d4dd-4e3f-a334-b7aca04e1c43",
      "name": "ubuntu",
      "type": "filebeat",
      "ephemeral_id": "7a2ebdb4-de03-43bc-a547-2f6fc49b84b4"
    },
    "input": {
      "type": "log"
    }
  },
  "fields": {
    "agent.version.keyword": [
      "8.9.1"
    ],
    "service.type.keyword": [
      "nginx"
    ],
    "input.type.keyword": [
      "log"
    ],
    "host.name.keyword": [
      "ubuntu"
    ],
    "event.dataset.keyword": [
      "nginx.access"
    ],
    "tags.keyword": [
      "beats_input_codec_plain_applied"
    ],
    "service.type": [
      "nginx"
    ],
    "agent.type": [
      "filebeat"
    ],
    "ecs.version.keyword": [
      "1.12.0"
    ],
    "event.module": [
      "nginx"
    ],
    "@version": [
      "1"
    ],
    "agent.name": [
      "ubuntu"
    ],
    "host.name": [
      "ubuntu"
    ],
    "event.timezone": [
      "+00:00"
    ],
    "log.file.path.keyword": [
      "/var/log/nginx/access.log"
    ],
    "agent.type.keyword": [
      "filebeat"
    ],
    "agent.ephemeral_id.keyword": [
      "7a2ebdb4-de03-43bc-a547-2f6fc49b84b4"
    ],
    "event.original": [
      "106.202.44.241 - - [05/Sep/2023:13:53:22 +0000] \"GET / HTTP/1.1\" 304 0 \"-\" \"Mozilla/5.0 (Linux; Android 10; K) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/116.0.0.0 Mobile Safari/537.36\""
    ],
    "agent.name.keyword": [
      "ubuntu"
    ],
    "agent.id.keyword": [
      "518bc225-d4dd-4e3f-a334-b7aca04e1c43"
    ],
    "fileset.name": [
      "access"
    ],
    "@version.keyword": [
      "1"
    ],
    "input.type": [
      "log"
    ],
    "log.offset": [
      937020
    ],
    "message": [
      "106.202.44.241 - - [05/Sep/2023:13:53:22 +0000] \"GET / HTTP/1.1\" 304 0 \"-\" \"Mozilla/5.0 (Linux; Android 10; K) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/116.0.0.0 Mobile Safari/537.36\""
    ],
    "tags": [
      "beats_input_codec_plain_applied"
    ],
    "fileset.name.keyword": [
      "access"
    ],
    "@timestamp": [
      "2023-09-05T13:53:29.941Z"
    ],
    "agent.id": [
      "518bc225-d4dd-4e3f-a334-b7aca04e1c43"
    ],
    "ecs.version": [
      "1.12.0"
    ],
    "event.original.keyword": [
      "106.202.44.241 - - [05/Sep/2023:13:53:22 +0000] \"GET / HTTP/1.1\" 304 0 \"-\" \"Mozilla/5.0 (Linux; Android 10; K) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/116.0.0.0 Mobile Safari/537.36\""
    ],
    "log.file.path": [
      "/var/log/nginx/access.log"
    ],
    "message.keyword": [
      "106.202.44.241 - - [05/Sep/2023:13:53:22 +0000] \"GET / HTTP/1.1\" 304 0 \"-\" \"Mozilla/5.0 (Linux; Android 10; K) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/116.0.0.0 Mobile Safari/537.36\""
    ],
    "event.module.keyword": [
      "nginx"
    ],
    "agent.ephemeral_id": [
      "7a2ebdb4-de03-43bc-a547-2f6fc49b84b4"
    ],
    "agent.version": [
      "8.9.1"
    ],
    "event.dataset": [
      "nginx.access"
    ],
    "event.timezone.keyword": [
      "+00:00"
    ]
  }
}
Tags: elasticsearch, logstash, kibana, elk, geoip
1 Answer (score: 0)

First, the condition gating your geoip filter tests a field that does not exist in the document:

[fileset][module]

so no events ever pass through your geoip filter.

Moreover, once you fix that, COMBINEDAPACHELOG will clearly not parse your nginx access logs the way you expect (Apache != Nginx), so the
message
field that geoip is told to parse will not contain a valid IP address.
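To see which fields actually exist on your events (and whether the grok match succeeds or adds a `_grokparsefailure` tag), you can temporarily add a debug output to the pipeline; this is a generic troubleshooting snippet, not part of your original config:

```
output {
  # dump every event, with all its fields, to the Logstash console
  stdout { codec => rubydebug }
}
```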

To sum up, you need to:

  1. Change the condition so that events actually run through the geoip filter
  2. Find the correct grok pattern to parse your nginx access logs
  3. Change the source field used by geoip to one that contains a valid IP
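Putting the three fixes together, a sketch of the corrected filter could look like the following. This assumes the fields shown in the indexed document above (`[event][module]` exists, `[fileset][module]` does not), a default nginx "combined" access log format, and a Logstash 8.x grok with ECS compatibility enabled, where `%{COMBINEDAPACHELOG}` captures the client IP as `[source][address]` (with ECS compatibility disabled the field is `clientip`). A custom nginx log_format would need its own grok pattern:

```
filter {
  # condition on a field that actually exists in the document
  if [event][module] == "nginx" {
    grok {
      # the sample log line above follows the combined format
      match => { "message" => "%{COMBINEDAPACHELOG}" }
    }
    geoip {
      # point geoip at the parsed client IP, not the whole message
      source => "[source][address]"   # use "clientip" if ECS compatibility is disabled
      target => "client_location"
    }
  }
}
```

Note that to actually draw these points on a Kibana map, the target field also needs to be mapped as `geo_point` in the index.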