I am trying to write data coming from Event Hub as JSON using SAQL. The input to the Azure Stream Analytics job looks like this:
{"ver":"2019-12-28 18:41:45.4184730","Data":"Data01","d":{"IDNUM":"XXXXX01","Time1":"2017-12-20T00:00:00.0000000Z","abc":"610000","efg":"0000","XYZ":"00000","ver":"2017-12-20T18:41:45.4184730Z"}}
{"ver":"2019-12-28 18:41:45.4184730","Data":"Data01","d":{"IDNUM":"XXXXX02","Time1":"2017-12-20T00:00:00.0000000Z","abc":"750000","efg":"0000","XYZ":"90000","ver":"2017-12-20T18:41:45.4184730Z"}}
{"ver":"2017-01-01 06:28:52.5041237","Data":"Data02","d":{"IDNUM":"XXXXX03","acc":-10.7000,"PQR":35.420639038085938,"XYZ":139.95817565917969,"ver":"2017-01-01T06:28:52.5041237Z"}}
{"ver":"2017-01-01 06:28:52.5041237","Data":"Data02","d":{"IDNUM":"XXXXX04","acc":-8.5999,"PQR":35.924240112304688,"XYZ":139.6097412109375,"ver":"2017-01-01T06:28:52.5041237Z"}}
In the first two rows the property Time1 is available, while in the last two rows the Time1 property itself does not exist.
I have to store the data into Cosmos DB based on the Time1 property in the input data.
Path in the JSON data >>> input.d.Time1.
I have to store the data that has Time1 into one Cosmos DB container, and the data without Time1 into another container.
I tried the SAQL below.
SELECT [input].ver,
    [input].Data,
    d.*
INTO [cosmosDB01]
FROM [input] PARTITION BY PartitionId
WHERE [input].Data IS NOT NULL
    AND [input].d.Time1 IS NOT NULL

SELECT [input].ver,
    [input].Data,
    d.*
INTO [cosmosDB02]
FROM [input] PARTITION BY PartitionId
WHERE [input].Data IS NOT NULL
    AND [input].d.Time1 IS NULL
Is there an EXISTS or similar keyword in the Stream Analytics query language?
As far as I know, there is no is_exists or is_defined built-in SQL keyword in ASA so far. You have to handle it with multiple outputs, as you mention in your question.
(Similar case: Azure Stream Analytics How to handle multiple output table?)
Of course, you could submit feedback to the ASA team to push the progress of ASA.
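As a side note, the two output queries can share a single pass over the input by putting the common projection into a WITH step, so the null check on Data is written only once. This is just a sketch of that pattern, assuming the same input and the container names cosmosDB01/cosmosDB02 from the question (cosmosDB02 being the assumed name of the second output):

```sql
-- Shared step: read the input once and apply the Data null check here.
WITH FlattenedInput AS (
    SELECT [input].ver,
           [input].Data,
           d.*          -- expands d's fields (IDNUM, Time1, ...) into top-level columns
    FROM [input] PARTITION BY PartitionId
    WHERE [input].Data IS NOT NULL
)

-- Events that carry Time1 go to the first container.
SELECT *
INTO [cosmosDB01]
FROM FlattenedInput
WHERE Time1 IS NOT NULL

-- Events without Time1 go to the second container.
SELECT *
INTO [cosmosDB02]
FROM FlattenedInput
WHERE Time1 IS NULL
```

Because d.* is expanded inside the step, the outer queries can filter on Time1 directly as a top-level column.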