I am pushing data to Elasticsearch via Logstash, and in the Logstash pipeline I aggregate the rows into a parent-child structure.
The data does get pushed, but some child rows go missing during aggregation. For example, a card has 10 chapters, yet all 10 never make it into the "chapters" array; sometimes only 3 are pushed, sometimes 5, sometimes 7. For some other cards all chapters do arrive, so the behaviour is erratic with no fixed pattern. I cannot find the cause of this; could an expert point out where the problem might be and what I am missing here?
My Logstash config file is below:
input {
  jdbc {
    jdbc_connection_string => "my connection string"
    jdbc_user => "my user"
    jdbc_password => "my_password"
    jdbc_driver_library => "/usr/share/logstash/logstash-core/lib/jars/jdbc-mssql.jar"
    jdbc_driver_class => "com.microsoft.sqlserver.jdbc.SQLServerDriver"
    last_run_metadata_path => "/etc/logstash/.logstash_jdbc_last_run_qa_flashcard"
    type => "card"
    statement => "select * from carddetailview where dateupdated > :sql_last_value"
    tracking_column => "dateupdated"
    tracking_column_type => "timestamp"
    use_column_value => true
    schedule => "* * * * *"
  }
}
filter {
  aggregate {
    task_id => "%{cardid}"
    code => "
      map['cardid']      = event.get('cardid')
      map['topicname']   = event.get('topicname')
      map['l1subject']   = event.get('l1subject')
      map['l2subject']   = event.get('l2subject')
      map['dateupdated'] = event.get('dateupdated')
      map['chapters']  ||= []
      map['chapters'] << {
        'detailid' => event.get('detailid'),
        'front'    => event.get('front'),
        'back'     => event.get('back')
      }
    "
    push_previous_map_as_event => true
    timeout => 60
    timeout_tags => ['aggregated']
  }
  mutate { remove_field => ["detailid", "front", "back"] }
}
output {
  elasticsearch {
    hosts => [ "http://localhost:9200" ]
    user => 'elastic'
    password => 'my_password'
    index => "myindex"
    document_type => "card"
    document_id => "%{cardid}"
  }
  stdout { codec => "json_lines" }
}
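To make sure I have not misread the aggregate `code` block itself, here is the same map-building logic as plain Ruby run over simulated rows (the row values below are made up for illustration; the field names come from my config). Run single-threaded over the rows in order, every row lands in `chapters`:

```ruby
# Simulated result-set rows for one card, mirroring the columns
# used in the aggregate filter's code block (values are made up).
rows = [
  { 'cardid' => 1, 'detailid' => 10, 'front' => 'F1', 'back' => 'B1' },
  { 'cardid' => 1, 'detailid' => 11, 'front' => 'F2', 'back' => 'B2' },
  { 'cardid' => 1, 'detailid' => 12, 'front' => 'F3', 'back' => 'B3' }
]

# Same fold the aggregate filter performs into its per-task map.
map = {}
rows.each do |event|
  map['cardid']     = event['cardid']
  map['chapters'] ||= []
  map['chapters'] << {
    'detailid' => event['detailid'],
    'front'    => event['front'],
    'back'     => event['back']
  }
end

puts map['chapters'].length  # => 3, all rows end up in 'chapters'
```

So when the rows for one card are processed sequentially, nothing is dropped, which makes me suspect the problem is in how Logstash schedules or orders the events rather than in the Ruby code.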
Below is a document as inserted into Elasticsearch. Here only 4 chapters were pushed, although according to my data there should be 21. On other runs it pushes 5 chapters, sometimes 6, sometimes 9.
{
  "userid" : 20,
  "chapters" : [
    {
      "carddetailid" : 1246,
      "backscore" : 36,
      "front" : "Flamebait",
      "back" : "A"
    },
    {
      "carddetailid" : 1247,
      "backscore" : 42,
      "front" : "Meme",
      "back" : "B"
    },
    {
      "carddetailid" : 1248,
      "backscore" : 40,
      "front" : "Posts",
      "back" : "C"
    },
    {
      "carddetailid" : 1249,
      "backscore" : 38,
      "front" : "Chats",
      "back" : "D"
    }
  ],
  "keyword" : "A program that appears desirable",
  "cardid" : "1"
}
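One thing I have not yet ruled out: the aggregate filter plugin's documentation warns that it only works correctly with a single filter worker, since otherwise events for the same task_id can be processed out of sequence across workers. A sketch of the relevant settings (assuming `pipelines.yml`, and Logstash 7.7+ for `pipeline.ordered`) would be:

```yaml
# pipelines.yml - single worker so all events for a card
# hit the same aggregate map, in order
- pipeline.id: flashcards
  path.config: "/etc/logstash/conf.d/flashcards.conf"
  pipeline.workers: 1
  pipeline.ordered: true
```

Could multiple pipeline workers explain the nondeterministic chapter counts I am seeing?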