Logstash config does not parse the JSON file or push data to the index - log: {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}

Problem description

I generate a JSON file every 5 minutes from Python code and try to push the data to Elastic, but Logstash prints the message below and never pushes any data through to Kibana.

My pipeline: file --> Logstash --> Elastic --> Kibana

Log message:

[INFO ][logstash.agent ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}

My logstash.conf file:

input {
        file {
                path => "D:/elk/logs_folder/test.json"
                start_position => "beginning"
                codec => "json"
        }
}

filter {
  json {
    skip_on_invalid_json => true
    source => "message"
    target => "jsonData"
    add_tag => [ "_message_json_parsed" ]    
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "in_elk_test"
  }

  stdout {
  }
}
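
For reference, codec => "json" on the line-oriented file input expects each line of test.json to be a complete JSON object (line-delimited JSON), since the file input emits one event per line. A hypothetical sample with made-up field names:

    {"timestamp": "2023-08-01T10:00:00Z", "level": "INFO", "message": "batch finished"}
    {"timestamp": "2023-08-01T10:05:00Z", "level": "INFO", "message": "batch finished"}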

I get the following when running Logstash.

Log:

[INFO ][logstash.outputs.elasticsearch][main] Elasticsearch version determined (8.9.0) {:es_version=>8}
[WARN ][logstash.outputs.elasticsearch][main] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>8}
[INFO ][logstash.outputs.elasticsearch][main] Not eligible for data streams because config contains one or more settings that are not compatible with data streams: {"index"=>"in_elk_test"}
[INFO ][logstash.outputs.elasticsearch][main] Data streams auto configuration (`data_stream => auto` or unset) resolved to `false`
[INFO ][logstash.outputs.elasticsearch][main] Using a default mapping template {:es_version=>8, :ecs_compatibility=>:v8}
[INFO ][logstash.javapipeline    ][main] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>16, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>2000, "pipeline.sources"=>["D:/elk/logstash-8.9.0-windows-x86_64/logstash-8.9.0/logstash-simple.conf"], :thread=>"#<Thread:0x50d159d0@D:/elk/logstash-8.9.0-windows-x86_64/logstash-8.9.0/logstash-core/lib/logstash/java_pipeline.rb:134 run>"}
[INFO ][logstash.javapipeline    ][main] Pipeline Java execution initialization time {"seconds"=>1.05}
[INFO ][logstash.inputs.file     ][main] No sincedb_path set, generating one based on the "path" setting {:sincedb_path=>"D:/elk/logstash-8.9.0-windows-x86_64/logstash-8.9.0/data/plugins/inputs/file/.sincedb_f2779ebeeb58467d208ce626cfa73491", :path=>["D:/elk/logs_folder/logs1.json"]}
[INFO ][logstash.javapipeline    ][main] Pipeline started {"pipeline.id"=>"main"}
[INFO ][filewatch.observingtail  ][main][223ec84e00c300043960ade7a8b1b9aa2a896b167223b1bf197e641e0ac119cd] START, creating Discoverer, Watch with file and sincedb collections
[INFO ][logstash.agent           ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}

After that statement, Logstash does not parse my JSON file and does not push any data to the index.

Please help me figure out the problem and fix it. Thanks in advance!

elasticsearch logstash logstash-configuration logstash-file logstash-filter
1 Answer

[INFO ][logstash.inputs.file     ][main] No sincedb_path set, generating one based on the "path" setting {:sincedb_path=>"D:/elk/logstash-8.9.0-windows-x86_64/logstash-8.9.0/data/plugins/inputs/file/.sincedb_f2779ebeeb58467d208ce626cfa73491", :path=>["D:/elk/logs_folder/logs1.json"]}

In the file input you also need to set sincedb_path to make sure the file is read from the beginning; otherwise, once you have started Logstash a few times, it will only read from the end of the file.

    file {
            path => "D:/elk/logs_folder/test.json"
            start_position => "beginning"
            sincedb_path => "NUL"
            codec => "json"
    }
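
Pointing sincedb_path at the Windows null device "NUL" means Logstash never persists its read position, so start_position => "beginning" takes effect on every run (on Linux/macOS the equivalent would be "/dev/null"). For reference, a full config sketch under these assumptions, reusing the paths and index name from the question:

    input {
            file {
                    path => "D:/elk/logs_folder/test.json"
                    start_position => "beginning"
                    # NUL = Windows null device, so the read position is never persisted
                    sincedb_path => "NUL"
                    codec => "json"
            }
    }

    filter {
      json {
        skip_on_invalid_json => true
        source => "message"
        target => "jsonData"
        add_tag => [ "_message_json_parsed" ]
      }
    }

    output {
      elasticsearch {
        hosts => ["localhost:9200"]
        index => "in_elk_test"
      }

      # print events to the console as well, for debugging
      stdout {
      }
    }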