Docker Logstash not registering the defined logs into Elasticsearch

Question · 0 votes · 1 answer

I'm new to ELK. I'm setting up the stack with Docker Compose, following this tutorial. My Spring Boot microservice writes its log file to ./logstash/logs on the host, so I mapped that directory into the Logstash container like this:

logstash:
   ...
   volumes:
     ...
     - ./logstash/logs:/usr/share/logstash/logs:ro,Z
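
As a quick sanity check of the mapping, it's worth confirming the file is actually visible (and readable) from inside the container; a minimal sketch, assuming the compose service is named logstash as above:

# Confirm the mounted log file exists inside the Logstash container
docker compose exec logstash ls -l /usr/share/logstash/logs/my-service.log
# Confirm the logstash user can actually read it
docker compose exec logstash head /usr/share/logstash/logs/my-service.log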

I changed logstash.conf as follows:

input {
    file {
        type => "log"
        path => "/usr/share/logstash/logs/my-service.log"
    }
}
## Add your filters / logstash plugins configuration here

output {
    elasticsearch {
        hosts => "elasticsearch:9200"
        index => "firs-log"
        user => "logstash_internal"
        password => "${LOGSTASH_INTERNAL_PASSWORD}"
    }
}
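
Worth knowing: the file input tails files from the end by default, so lines that were already in my-service.log before Logstash started are never read. A test-only variant of the input block (sincedb_path => "/dev/null" forces a full re-read on every restart, so don't keep it in production) rules that out:

input {
    file {
        type => "log"
        path => "/usr/share/logstash/logs/my-service.log"
        # Read the file from the top instead of only tailing new lines
        start_position => "beginning"
        # Test only: don't persist read offsets; re-ingest everything on restart
        sincedb_path => "/dev/null"
    }
}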

When I run the containers, no index is created in Elasticsearch.
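A direct way to confirm that from the host is to list all indices; the credentials are placeholders for whatever your stack uses, and port 9200 is assumed to be published to the host:

# "firs-log" should appear in this list once Logstash has shipped at least one event
curl -u 'elastic:<your-password>' 'http://localhost:9200/_cat/indices?v'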

My Logstash logs look like this:

2024-02-24 14:36:45 /usr/share/logstash/vendor/bundle/jruby/3.1.0/gems/concurrent-ruby-1.1.9/lib/concurrent-ruby/concurrent/executor/java_thread_pool_executor.rb:13: warning: method redefined; discarding old to_int
2024-02-24 14:36:45 /usr/share/logstash/vendor/bundle/jruby/3.1.0/gems/concurrent-ruby-1.1.9/lib/concurrent-ruby/concurrent/executor/java_thread_pool_executor.rb:13: warning: method redefined; discarding old to_f
2024-02-24 14:36:39 Using bundled JDK: /usr/share/logstash/jdk
2024-02-24 14:36:47 Sending Logstash logs to /usr/share/logstash/logs which is now configured via log4j2.properties
2024-02-24 14:36:48 [2024-02-24T11:06:48,011][INFO ][logstash.runner] Log4j configuration path used is: /usr/share/logstash/config/log4j2.properties
2024-02-24 14:36:48 [2024-02-24T11:06:48,014][INFO ][logstash.runner] Starting Logstash {"logstash.version"=>"8.12.1", "jruby.version"=>"jruby 9.4.5.0 (3.1.4) 2023-11-02 1abae2700f OpenJDK 64-Bit Server VM 17.0.10+7 on 17.0.10+7 +indy +jit [x86_64-linux]"}
2024-02-24 14:36:48 [2024-02-24T11:06:48,016][INFO ][logstash.runner] JVM bootstrap flags: [-Xms1g, -Xmx1g, -Djava.awt.headless=true, -Dfile.encoding=UTF-8, -Djruby.compile.invokedynamic=true, -XX:+HeapDumpOnOutOfMemoryError, -Djava.security.egd=file:/dev/urandom, -Dlog4j2.isThreadContextMapInheritable=true, -Dlogstash.jackson.stream-read-constraints.max-string-length=200000000, -Dlogstash.jackson.stream-read-constraints.max-number-length=10000, -Dls.cgroup.cpuacct.path.override=/, -Dls.cgroup.cpu.path.override=/, -Xms256m, -Xmx256m, -Djruby.regexp.interruptible=true, -Djdk.io.File.enableADS=true, --add-exports=jdk.compiler/com.sun.tools.javac.api=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.file=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.parser=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.tree=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.util=ALL-UNNAMED, --add-opens=java.base/java.security=ALL-UNNAMED, --add-opens=java.base/java.io=ALL-UNNAMED, --add-opens=java.base/java.nio.channels=ALL-UNNAMED, --add-opens=java.base/sun.nio.ch=ALL-UNNAMED, --add-opens=java.management/sun.management=ALL-UNNAMED]
2024-02-24 14:36:48 [2024-02-24T11:06:48,017][INFO ][logstash.runner] Jackson default value override `logstash.jackson.stream-read-constraints.max-string-length` configured to `200000000`
2024-02-24 14:36:48 [2024-02-24T11:06:48,018][INFO ][logstash.runner] Jackson default value override `logstash.jackson.stream-read-constraints.max-number-length` configured to `10000`
2024-02-24 14:36:48 [2024-02-24T11:06:48,024][INFO ][logstash.settings] Creating directory {:setting=>"path.queue", :path=>"/usr/share/logstash/data/queue"}
2024-02-24 14:36:48 [2024-02-24T11:06:48,026][INFO ][logstash.settings] Creating directory {:setting=>"path.dead_letter_queue", :path=>"/usr/share/logstash/data/dead_letter_queue"}
2024-02-24 14:36:48 [2024-02-24T11:06:48,126][INFO ][logstash.agent] No persistent UUID file found. Generating new UUID {:uuid=>"8a4c26a8-9959-48ea-abf5-cba7f4e0e008", :path=>"/usr/share/logstash/data/uuid"}
2024-02-24 14:36:48 [2024-02-24T11:06:48,454][INFO ][logstash.agent] Successfully started Logstash API endpoint {:port=>9600, :ssl_enabled=>false}
2024-02-24 14:36:48 [2024-02-24T11:06:48,700][INFO ][org.reflections.Reflections] Reflections took 70 ms to scan 1 urls, producing 132 keys and 468 values
2024-02-24 14:36:48 [2024-02-24T11:06:48,864][INFO ][logstash.javapipeline] Pipeline `main` is configured with `pipeline.ecs_compatibility: v8` setting. All plugins in this pipeline will default to `ecs_compatibility => v8` unless explicitly configured otherwise.
2024-02-24 14:36:48 [2024-02-24T11:06:48,875][INFO ][logstash.outputs.elasticsearch][main] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//elasticsearch:9200"]}
2024-02-24 14:36:48 [2024-02-24T11:06:48,955][INFO ][logstash.outputs.elasticsearch][main] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://logstash_internal:xxxxxx@elasticsearch:9200/]}}
2024-02-24 14:36:49 [2024-02-24T11:06:49,088][WARN ][logstash.outputs.elasticsearch][main] Restored connection to ES instance {:url=>"http://logstash_internal:xxxxxx@elasticsearch:9200/"}
2024-02-24 14:36:49 [2024-02-24T11:06:49,089][INFO ][logstash.outputs.elasticsearch][main] Elasticsearch version determined (8.12.1) {:es_version=>8}
2024-02-24 14:36:49 [2024-02-24T11:06:49,089][WARN ][logstash.outputs.elasticsearch][main] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>8}
2024-02-24 14:36:49 [2024-02-24T11:06:49,101][INFO ][logstash.outputs.elasticsearch][main] Not eligible for data streams because config contains one or more settings that are not compatible with data streams: {"index"=>"firs-log"}
2024-02-24 14:36:49 [2024-02-24T11:06:49,101][INFO ][logstash.outputs.elasticsearch][main] Data streams auto configuration (`data_stream => auto` or unset) resolved to `false`
2024-02-24 14:36:49 [2024-02-24T11:06:49,109][INFO ][logstash.outputs.elasticsearch][main] Using a default mapping template {:es_version=>8, :ecs_compatibility=>:v8}
2024-02-24 14:36:49 [2024-02-24T11:06:49,112][INFO ][logstash.javapipeline][main] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>16, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>2000, "pipeline.sources"=>["/usr/share/logstash/pipeline/logstash.conf"], :thread=>"#<Thread:0x77c98da8 /usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:134 run>"}
2024-02-24 14:36:49 [2024-02-24T11:06:49,151][INFO ][logstash.outputs.elasticsearch][main] Installing Elasticsearch template {:name=>"ecs-logstash"}
2024-02-24 14:36:49 [2024-02-24T11:06:49,672][INFO ][logstash.javapipeline][main] Pipeline Java execution initialization time {"seconds"=>0.56}
2024-02-24 14:36:49 [2024-02-24T11:06:49,681][INFO ][logstash.inputs.file][main] No sincedb_path set, generating one based on the "path" setting {:sincedb_path=>"/usr/share/logstash/data/plugins/inputs/file/.sincedb_9391fb0a817f4b225df7fa90e990b01b", :path=>["/usr/share/logstash/logs/my-service.log"]}
2024-02-24 14:36:49 [2024-02-24T11:06:49,683][INFO ][logstash.javapipeline][main] Pipeline started {"pipeline.id"=>"main"}
2024-02-24 14:36:49 [2024-02-24T11:06:49,688][INFO ][filewatch.observingtail][main][cbe1433791fc583a55e100262676d95761e4d64e46aab200140ca5435ba5c2c8] START, creating Discoverer, Watch with file and sincedb collections
2024-02-24 14:36:49 [2024-02-24T11:06:49,696][INFO ][logstash.agent] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
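
Note that this log only shows a clean startup: the pipeline starts and the file input begins watching my-service.log, so a failure (for example a security error on the bulk request to Elasticsearch) would only show up later, once an event is actually sent. A sketch for catching that, again assuming the compose service is named logstash:

# Follow the Logstash log and surface delivery failures as they happen
docker compose logs -f logstash | grep -iE 'error|unauthorized|forbidden|retry'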

Where is the problem? Why is my-service.log not being registered in Elasticsearch through Logstash?

I want my-service.log to be shipped into elasticsearch via logstash.
docker-compose logstash elk
1 Answer
0 votes

The roles of your "logstash_internal" user are probably causing the problem. I suggest you check the users and roles in Kibana, under Stack Management → Security → Users / Roles.

There are usually built-in users, but if you use a custom user, you have to assign it a role that allows Logstash to ship the log data to the index.
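For instance, docker-elk style setups typically give logstash_internal a writer role that only covers index patterns such as logstash-* and ecs-logstash-*, so a custom index name like firs-log can be rejected with a 403 on the bulk request. A sketch of widening that role via the Elasticsearch security API; the role name, privileges, and credentials here are illustrative and must match your stack:

# Grant the writer role privileges on the custom index (run as a superuser such as elastic)
curl -u 'elastic:<your-password>' -X PUT 'http://localhost:9200/_security/role/logstash_writer' \
  -H 'Content-Type: application/json' -d '{
    "cluster": ["manage_index_templates", "monitor"],
    "indices": [{
      "names": ["firs-log", "logstash-*", "ecs-logstash-*"],
      "privileges": ["write", "create", "create_index", "manage"]
    }]
  }'

Alternatively, renaming the index so it matches a pattern the role already covers (for example an ecs-logstash-* name) avoids touching the role at all.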
