Just pipe it through `keys`.

Sample `input.json`:
{
"connections": {
"host1": { "ip": "10.1.2.3" },
"host2": { "ip": "10.1.2.2" },
"host3": { "ip": "10.1.18.1" }
}
}
jq -r '.connections | keys[] as $k | "\($k), \(.[$k] | .ip)"' input.json
Output:
host1, 10.1.2.3
host2, 10.1.2.2
host3, 10.1.18.1
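For comparison, the same output can also be produced with `to_entries`, which iterates key/value pairs directly instead of indexing back through `keys[]` (a stylistic alternative, not part of the original answer):

```shell
# Recreate the sample file, then iterate entries instead of keys[]
cat > input.json <<'EOF'
{
  "connections": {
    "host1": { "ip": "10.1.2.3" },
    "host2": { "ip": "10.1.2.2" },
    "host3": { "ip": "10.1.18.1" }
  }
}
EOF
jq -r '.connections | to_entries[] | "\(.key), \(.value.ip)"' input.json
```

Note that `to_entries` preserves the original key order, like `keys_unsorted`.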
`keys` produces the key names in sorted order; if you want them in their original order, use `keys_unsorted`. Presumably the OP was aware of this and chose `keys` deliberately.
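The difference is easy to see on a tiny object whose keys are not already in alphabetical order:

```shell
# keys sorts alphabetically; keys_unsorted preserves the input order
echo '{"b":1,"a":2}' | jq -c 'keys'            # → ["a","b"]
echo '{"b":1,"a":2}' | jq -c 'keys_unsorted'   # → ["b","a"]
```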
A more generic bash function to export vars (with interpolation):
#
#------------------------------------------------------------------------------
# usage example:
# doExportJsonSectionVars cnf/env/dev.env.json '.env.virtual.docker.spark_base'
#------------------------------------------------------------------------------
doExportJsonSectionVars(){
   json_file="$1"
   shift 1
   # guard clauses: group echo and return with { ; }, otherwise
   # `test ... || echo ... && exit 1` exits even on success due to
   # operator precedence; use return (not exit) so a sourced shell survives
   test -f "$json_file" || { echo "the json_file: $json_file does not exist !!! Nothing to do" ; return 1 ; }
   section="$1"
   test -z "$section" && { echo "the section in doExportJsonSectionVars is empty !!! nothing to do !!!" ; return 1 ; }
   shift 1
   while read -r l ; do
      eval "$l"   # quoted, so each emitted "export KEY=VALUE" line runs as-is
   done < <(jq -r "$section"'|keys_unsorted[] as $key|"export \($key)=\(.[$key])"' "$json_file")
}
Sample data:
cat cnf/env/dev.env.json
{
"env": {
"ENV_TYPE": "dev",
"physical": {
"var_name": "var_value"
},
"virtual": {
"docker": {
"spark_base": {
"SPARK_HOME": "/opt/spark"
, "SPARK_CONF": "$SPARK_HOME/conf"
}
, "spark_master": {
"var_name": "var_value"
}
, "spark_worker": {
"var_name": "var_value"
}
}
, "var_name": "var_value"
}
}
}
Since `keys` sorts the keys, it is worth pointing out that `keys_unsorted` does not.
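To see what the function actually feeds to `eval`, the embedded jq program can be run standalone against a trimmed-down copy of the sample section (file name and trimming are mine, for illustration):

```shell
# Minimal copy of the spark_base section from the sample data
cat > dev.env.json <<'EOF'
{"env":{"virtual":{"docker":{"spark_base":{
  "SPARK_HOME": "/opt/spark",
  "SPARK_CONF": "$SPARK_HOME/conf"
}}}}}
EOF
# The jq program inside doExportJsonSectionVars, shown standalone
jq -r '.env.virtual.docker.spark_base
       | keys_unsorted[] as $key
       | "export \($key)=\(.[$key])"' dev.env.json
# export SPARK_HOME=/opt/spark
# export SPARK_CONF=$SPARK_HOME/conf
```

The interpolation happens only at `eval` time: the second line still contains the literal `$SPARK_HOME`, and evaluating the lines in order leaves `SPARK_CONF` set to `/opt/spark/conf`.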