My Linux VM has the Linux Azure Diagnostics extension installed, configured to push syslog messages to an Event Hub.
I can see my syslog messages in the Event Hub's Process Data blade. Now I am trying to send these logs to Azure Data Explorer (ADX), and to do so I followed the steps below.
I created a database (Syslog) and a table (SyslogTable) to store the syslog messages. Everything appeared to go smoothly: .show ingestion failures
reports no errors, yet I cannot see any data in the ADX table.
Here is my configuration.
Sample of the data as viewed from the Event Hub, in JSON format
{
"time": "2020-05-18T15:54:01.0000000Z",
"resourceId": "/subscriptions/xxxxx/resourceGroups/xxxx/providers/Microsoft.Compute/virtualMachines/vmname",
"properties": {
"ident": "systemd",
"Ignore": "syslog",
"Facility": "daemon",
"Severity": "info",
"EventTime": "2020-05-18T15:54:01.0000000",
"SendingHost": "localhost",
"Msg": "Removed slice User Slice of root.",
"hostname": "vmname",
"FluentdIngestTimestamp": "2020-05-18T15:54:01.0000000Z"
},
"category": "daemon",
"level": "info",
"operationName": "LinuxSyslogEvent",
"EventProcessedUtcTime": "2020-05-19T07:39:48.5220591Z",
"PartitionId": 0,
"EventEnqueuedUtcTime": "2020-05-18T15:54:05.4390000Z"
}
ADX table schema
.create table SyslogTable (
eventTime: datetime,
resourceId: string,
properties: dynamic,
category: string,
level: string,
operationName: string,
EventProcessedUtcTime: string,
PartitionId: int,
EventEnqueuedUtcTime: datetime
)
ADX syslog table ingestion mapping
.create table SyslogTable ingestion json mapping "SyslogMapping"
'['
' {"column":"eventTime", "Properties": {"Path": "$.time"}},'
' {"column":"resourceId", "Properties": {"Path":"$.resourceId"}},'
' {"column":"properties", "Properties": {"Path":"$.properties"}},'
' {"column":"category", "Properties": {"Path":"$.category"}},'
' {"column":"level", "Properties": {"Path": "$.level"}},'
' {"column":"operationName", "Properties": {"Path": "$.operationName"}},'
' {"column":"EventProcessedUtcTime", "Properties": {"Path": "$.EventProcessedUtcTime"}},'
' {"column":"PartitionId", "Properties": {"Path": "$.PartitionId"}},'
' {"column":"EventEnqueuedUtcTime", "Properties": {"Path": "$.EventEnqueuedUtcTime"}}'
']'
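To sanity-check that the mapping and the payload line up, here is a small Python sketch (not ADX code; MAPPING and map_event are hypothetical helpers written for illustration) that projects one Event Hub event onto the SyslogTable columns the same way the JSON paths above do:

```python
import json

# Hypothetical emulation of the "SyslogMapping" ingestion mapping:
# each table column is filled from a top-level path in the event JSON.
MAPPING = {
    "eventTime": "time",
    "resourceId": "resourceId",
    "properties": "properties",
    "category": "category",
    "level": "level",
    "operationName": "operationName",
    "EventProcessedUtcTime": "EventProcessedUtcTime",
    "PartitionId": "PartitionId",
    "EventEnqueuedUtcTime": "EventEnqueuedUtcTime",
}

def map_event(event: dict) -> dict:
    """Project one Event Hub event onto the SyslogTable columns.

    Missing paths become None, mirroring how ADX leaves a column
    empty when the mapped path is absent from the payload.
    """
    return {column: event.get(path) for column, path in MAPPING.items()}

# A trimmed version of the sample payload shown above.
sample = json.loads("""{
  "time": "2020-05-18T15:54:01.0000000Z",
  "category": "daemon",
  "level": "info",
  "PartitionId": 0,
  "properties": {"Severity": "info"}
}""")

row = map_event(sample)
print(row["eventTime"])    # 2020-05-18T15:54:01.0000000Z
print(row["PartitionId"])  # 0
```

Every column in the mapping resolves against the sample, which suggests the mapping itself is consistent with the payload shape.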
Data connection settings
Table: SyslogTable
Column Mapping: SyslogMapping
Data Format: Multiline Json/Json # tried with both
So, what am I missing here?
The reason the data was not being pushed to the ADX table was that I had defined the ADX data connection with the $Default
consumer group, while I was already using the $Default
consumer group elsewhere to pull events from the Event Hub.
So the fix was simply to create a new consumer group on the Event Hub and create a new data connection that uses it.
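The fix above can be sketched with the Azure CLI. Treat this as an illustrative provisioning fragment, not exact syntax: the resource names (MyRG, mynamespace, myhub, mycluster, SyslogConnection, adx-syslog-cg) are placeholders, the truncated resource ID must be filled in with your own, and the data-connection command requires the az kusto extension:

```shell
# Create a dedicated consumer group for the ADX data connection,
# so it does not compete with $Default for the same events.
az eventhubs eventhub consumer-group create \
  --resource-group MyRG \
  --namespace-name mynamespace \
  --eventhub-name myhub \
  --name adx-syslog-cg

# Recreate the ADX data connection against the new consumer group.
az kusto data-connection event-hub create \
  --resource-group MyRG \
  --cluster-name mycluster \
  --database-name Syslog \
  --data-connection-name SyslogConnection \
  --event-hub-resource-id "<your Event Hub resource ID>" \
  --consumer-group adx-syslog-cg \
  --table-name SyslogTable \
  --mapping-rule-name SyslogMapping \
  --data-format MULTIJSON
```

The key point is that each consumer of an Event Hub needs its own consumer group, since readers in the same group share (rather than duplicate) the event stream.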
Your ingestion mapping looks fine, given the table schema and the payload schema.
For example, if you run this, you will see the data ingested successfully:
.ingest inline into table SyslogTable with(format=multijson, ingestionMappingReference='SyslogMapping') <|
{
"time": "2020-05-18T15:54:01.0000000Z",
"resourceId": "/subscriptions/xxxxx/resourceGroups/xxxx/providers/Microsoft.Compute/virtualMachines/vmname",
"properties": {
"ident": "systemd",
"Ignore": "syslog",
"Facility": "daemon",
"Severity": "info",
"EventTime": "2020-05-18T15:54:01.0000000",
"SendingHost": "localhost",
"Msg": "Removed slice User Slice of root.",
"hostname": "vmname",
"FluentdIngestTimestamp": "2020-05-18T15:54:01.0000000Z"
},
"category": "daemon",
"level": "info",
"operationName": "LinuxSyslogEvent",
"EventProcessedUtcTime": "2020-05-19T07:39:48.5220591Z",
"PartitionId": 0,
"EventEnqueuedUtcTime": "2020-05-18T15:54:05.4390000Z"
}
To troubleshoot the issue you are facing, assuming you have already verified that the data is successfully being pushed to the Event Hub, I would suggest opening a support ticket for your resource through the Azure portal.
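Before opening a ticket, a few standard ADX management commands (run against the Syslog database) can help confirm where the pipeline stalls:

```
// Any ingestion errors recorded for the database?
.show ingestion failures

// Does the table contain any rows yet?
SyslogTable
| count

// Is the ingestion mapping registered under the expected name?
.show table SyslogTable ingestion json mappings
```

If the failures list is empty, the row count is zero, and the mapping is registered, the events are most likely never reaching the data connection at all, which points back at the Event Hub side (for example, the consumer-group conflict described in the accepted answer).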