Filebeat to ship messages from a Kafka topic to Elasticsearch


I want to set up Filebeat so that it consumes messages from a Kafka topic and sends them on to Elasticsearch. Is there a way to set all of this up in docker-compose? I can register Logstash as a Kafka consumer, so I assume the same should be possible with Filebeat. In the logs I can see that Filebeat loads its configuration file. However, it does not appear in Kafka's consumer list, so Filebeat consumes no messages even though they are present in Kafka. Can someone point me in the right direction? I assume this should be a fairly basic configuration, yet I am missing something.

docker-compose.yml:

version: '3'
services:
  PostgreSQL:
    image: postgres:latest
    environment:
      - POSTGRES_DB=distillery
      - POSTGRES_USER=admin
      - POSTGRES_PASSWORD=secret
    ports:
      - "5434:5432"

  zookeeper:
    image: confluentinc/cp-zookeeper:latest
    environment:
      ZOOKEEPER_CLIENT_PORT: 2181
      ZOOKEEPER_TICK_TIME: 2000
    ports:
      - "22181:2181"

  kafka:
    image: confluentinc/cp-kafka:latest
    depends_on:
      - zookeeper
    ports:
      - "29092:29092"
    environment:
      KAFKA_BROKER_ID: 1
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
      KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://kafka:9092,PLAINTEXT_HOST://localhost:29092
      KAFKA_LISTENER_SECURITY_PROTOCOL_MAP: PLAINTEXT:PLAINTEXT,PLAINTEXT_HOST:PLAINTEXT
      KAFKA_INTER_BROKER_LISTENER_NAME: PLAINTEXT
      KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1
      KAFKA_LOG_RETENTION_MS: 10000
      KAFKA_LOG_RETENTION_CHECK_INTERVAL_MS: 5000

  elasticsearch:
    image: elasticsearch:7.9.2
    depends_on:
      - kafka
    ports:
      - '9200:9200'
    environment:
      - discovery.type=single-node
    ulimits:
      memlock:
        soft: -1
        hard: -1

  filebeat:
    depends_on:
      - kafka
    image: docker.elastic.co/beats/filebeat:7.9.2
    container_name: filebeat
    volumes:
      - "./filebeat.yml"
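One likely problem in the compose file: the `volumes` entry for the filebeat service names only a host path, with no container-side target, so the custom filebeat.yml may never override the config baked into the image. A sketch of the service with an explicit bind mount, assuming the official image's default config path (`/usr/share/filebeat/filebeat.yml`):

```yaml
  filebeat:
    depends_on:
      - kafka
    image: docker.elastic.co/beats/filebeat:7.9.2
    container_name: filebeat
    volumes:
      # bind-mount the local config over the image's default filebeat.yml
      # (read-only so the container cannot modify the host file)
      - "./filebeat.yml:/usr/share/filebeat/filebeat.yml:ro"
```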

Contents of filebeat.yml:

filebeat.inputs:
  - type: kafka
    hosts:
      - kafka:29092
    topics: ["progress-raspberry"]
    client_id: "filebeat"
    group_id: "filebeat"

output.elasticsearch:
  hosts: ["localhost:9200"]
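A second likely problem is addressing: inside the compose network, other containers must use the internal listener. Per the `KAFKA_ADVERTISED_LISTENERS` setting above, `kafka:9092` is the address advertised to other containers, while port 29092 is advertised as `localhost:29092` and only works from the host machine. Likewise, from inside the filebeat container, Elasticsearch is reachable as `elasticsearch:9200`, not `localhost:9200`. A corrected filebeat.yml sketch under those assumptions:

```yaml
filebeat.inputs:
  - type: kafka
    hosts:
      # internal listener from KAFKA_ADVERTISED_LISTENERS (PLAINTEXT://kafka:9092);
      # 29092 is advertised as localhost and is unreachable from other containers
      - kafka:9092
    topics: ["progress-raspberry"]
    client_id: "filebeat"
    group_id: "filebeat"

output.elasticsearch:
  # use the compose service name, not localhost, when Filebeat runs in a container
  hosts: ["elasticsearch:9200"]
```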
elasticsearch apache-kafka docker-compose filebeat elk