My script builds a JSON data set and then tries to display it as a table.
In the output of the first command you can see the sample data set:
echo ${conn_list[@]} | jq '.'
{
"host": {
"name": "mike1",
"node": "c04",
"s_ip": "10.244.7.235",
"s_port": "38558",
"d_ip": "129.12.34.567",
"d_port": "22",
"pif_ip": "129.23.45.678",
"pif_port": "11019"
}
}
{
"host": {
"name": "fhlb-test",
"node": "c04",
"s_ip": "10.244.7.20",
"s_port": "49846",
"d_ip": "129.98.76.543",
"d_port": "22",
"pif_ip": "129.87.65.432",
"pif_port": "23698"
}
}
I'm using the following jq command with @tsv to try to build and populate the table, but I run into this error:
echo ${conn_list[@]} | jq -r '(["NAME","NODE","SOURCE IP","SOURCE PORT","DESTINATION IP","DESTINATION PORT","GATEWAY IP","GATEWAY PORT"], (.[], map(length*"-")), (.[] | [.name, .node, .s_ip, .s_port, .d_ip, .d_port, .pif_ip, .pif_port])) | @tsv'
NAME NODE SOURCE IP SOURCE PORT DESTINATION IP DESTINATION PORT GATEWAY IP GATEWAY PORT
jq: error (at <stdin>:1): object ({"name":"mi...) cannot be tsv-formatted, only array
NAME NODE SOURCE IP SOURCE PORT DESTINATION IP DESTINATION PORT GATEWAY IP GATEWAY PORT
jq: error (at <stdin>:1): object ({"name":"fh...) cannot be tsv-formatted, only array
My goal is to have just one column-header row in the table, rather than one per entry, and of course to display the data instead of errors. The (.[], map(length*"-")) bit is meant to auto-generate correctly sized runs of dashes to separate the column headers from the data. Does anyone see what I'm doing wrong? :)
A fixed version might look like this:
jq -rn '
# Assign the list of fields to a variable
["NAME","NODE","SOURCE IP","SOURCE PORT","DESTINATION IP","DESTINATION PORT","GATEWAY IP","GATEWAY PORT"] as $fields |
(
$fields, # emit the list as a header
($fields | map(length*"-")), # print separators below each header
(inputs | .[] | [.name, .node, .s_ip, .s_port, .d_ip, .d_port, .pif_ip, .pif_port])
) | @tsv' <<<"$s" # where s is a string with your JSON content.
...which, for your input, emits the following output (not reformatted to align the tabs):
NAME NODE SOURCE IP SOURCE PORT DESTINATION IP DESTINATION PORT GATEWAY IP GATEWAY PORT
---- ---- --------- ----------- -------------- ---------------- ---------- ------------
mike1 c04 10.244.7.235 38558 129.12.34.567 22 129.23.45.678 11019
fhlb-test c04 10.244.7.20 49846 129.98.76.543 22 129.87.65.432 23698
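A side note on the -n / inputs combination used in the fix: the sample is a stream of separate JSON documents rather than one array, so a plain filter over . would only see one document at a time. With -n, jq does not consume the first input up front, and inputs then yields every document in the stream exactly once. A minimal sketch (the tiny documents here are made up for illustration, assuming jq is installed):

```shell
#!/bin/sh
# Two independent JSON documents in one stream, like the question's data.
# -n keeps jq from auto-reading the first document, so `inputs` sees both;
# .[] then unwraps the "host" object and .name extracts the field.
printf '%s\n' '{"host":{"name":"a"}}' '{"host":{"name":"b"}}' |
  jq -rn '[inputs | .[] | .name] | @tsv'
# output: the two names joined by a tab
```

Without -n, the first document would already have been consumed before inputs runs, and it would be silently dropped from the table.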
The immediate bug was the (.[], map(length*"-")) bit: the .[] part of it is pointless, doing nothing except injecting the raw objects (which, not being arrays, are invalid as @tsv content) into the data stream.