Logstash ignores new lines in CSV


I am trying to upload a CSV file to Elasticsearch via Logstash, and Logstash appears to be ignoring the new lines in the CSV file.

root@debian:~# cat jmeter-localhost-dummy.csv
23c2a43199061f723b85832e49d5ff1cdc6590b085b1fdf17629e36d70c1ff68@@2023-11-08 11:10:09,689@@influxdb@@test@@dummy@@localhost@@INFO@@o.a.j.s.SampleResult: sampleresult.nanoThreadSleep=5000
cb17fb2a09a3f3cdb756792b871b488c551172cafe3bcf14aa9c1ed777a34df6@@2023-11-08 11:10:09,734@@influxdb@@test@@dummy@@localhost@@INFO@@o.a.j.t.ThreadGroup: Started thread group number 1
...
And so on ..

root@debian:~# cat /etc/logstash/conf.d/http-pipeline.conf
input {
  http {
    host => "0.0.0.0"
    port => "3101"
  }
}

output {
  elasticsearch {
    hosts => ["http://localhost/"]
    ssl => false
    ssl_certificate_verification => false
    index => "logstash"
    document_id => "%{sha256}"
  }
  stdout {
  }
}

filter {
  csv {
    skip_header => "true"
    columns => ["sha256", "datetime", "level", "campagne", "env", "tir", "injecteur", "message"]
    separator => "@@"
  }
  date {
    match => [ "datetime", "yyyy-MM-dd HH:mm:ss','SSS" ]
    timezone => "Europe/Paris"
    target => "@timestamp"
  }
  mutate {
    remove_field => [ "datetime" ]
  }
}

I upload the file with a POST to the API (http://localhost:3101/), but I only get 1 record in Elasticsearch... What am I doing wrong? Is there something I have missed or misunderstood?
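For illustration, a raw-body POST along these lines would send the file to that endpoint (the exact command is not shown here, so treat it as an assumed example; note that curl's -d option strips newlines from a file, so --data-binary is needed to keep the line breaks intact):

curl -X POST --data-binary @jmeter-localhost-dummy.csv http://localhost:3101/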

elasticsearch logstash

1 Answer

You might want to look into codecs. By default the http input passes the whole request body on as a single event, so the csv filter only ever parses the first row; the line codec splits the body into one event per line:

codec => line

In the input, that would look like:

input {
  http {
    host => "0.0.0.0"
    port => "3101"
    codec => line
  }
}
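With the line codec in place, the request body is split on newlines, so each CSV row becomes its own event; the csv filter then runs once per row and every record is indexed separately under its sha256 document_id (provided the newlines are still present in the body that reaches Logstash).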
