ashishpm

Reputation: 491

Using jq, how can I pass multiple values as arguments to a function?

I have a JSON file test.json with the following content:

[
{
  "name": "Akshay",
  "id": "234"
},
{
  "name": "Amit",
  "id": "28"
}
]

I have a shell script with content:

#!/bin/bash
function display
{
  echo "name is $1 and id is $2"
}

cat test.json | jq '.[].name,.[].id' | while read line; do display $line; done

I want the name and id of a single item to be passed together as arguments to the function display, but the output is something like this:

name is "Akshay" and id is 
name is "Amit" and id is   
name is "234" and id is 
name is "28" and id is 

What is the correct way to implement this? PS: I specifically want to use jq, so please base the answer on jq.

Upvotes: 0

Views: 1510

Answers (2)

Kapooky Handy

Reputation: 111

You can use string interpolation.

jq '.[] | "The name is \(.name) and id \(.id)"'

Result:

"The name is Akshay and id 234"
"The name is Amit and id 28"
"The name is hi and id 28"

If you want to get rid of the double quotes around each output line, use --raw-output:

jq --raw-output '.[] | "The name is \(.name) and id \(.id)"'

https://jqplay.org/s/-lkpHROTBk0
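
If you also want the values handed to the shell function from the question, one minimal sketch (not part of this answer, and assuming the names and ids never contain tabs or newlines) is to emit tab-separated raw output and split it with read:

jq -r '.[] | [.name, .id] | @tsv' test.json | while IFS=$'\t' read -r name id; do
  display "$name" "$id"
done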

Upvotes: 1

Charles Duffy

Reputation: 295629

Two major issues, and some additional items that may not matter for your current example use case but can be important when you're dealing with real-world data from untrusted sources:

  • Your current code iterates over all names before writing any ids.
  • Your current code uses newline separators, but doesn't make any effort to read multiple lines into each while loop iteration.
  • Your code uses newline separators, but newlines can appear inside JSON strings, so relying on them as separators constrains which inputs your code can handle correctly.
  • When you pipe into a while loop, that loop runs in a subshell; when the pipeline exits, the subshell does too, so any variables set by the loop are lost (see the short demonstration after this list).
  • Starting up a copy of /bin/cat and making jq read a pipe from its output is silly and inefficient compared to letting jq read from test.json directly.
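
For example, a minimal demonstration of that subshell behavior (not from the original answer):

count=0
printf '%s\n' a b c | while read -r line; do count=$((count+1)); done
echo "count=$count"   # prints count=0: the increments happened in a subshell and were lost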

We can fix all of those:

  • To write names and ids in pairs, you'd want something more like jq '.[] | (.name, .id)'
  • To read both a name and an id for each element of the loop, you'd want while IFS= read -r name && IFS= read -r id; do ... to iterate over those pairs.
  • To switch from newlines to NULs (the NUL being the only character that can't exist in a C string, and thus in a bash string), you'd want to use the -j argument to jq, and then add explicit "\u0000" elements to the content being written. To read this NUL-delimited content on the bash side, you'd need to add the -d '' argument to each read.
  • To move the while read loop out of the subshell, we can use process substitution, as described in BashFAQ #24.
  • To let jq read directly from test.json, use either <test.json to have the shell connect the file directly to jq's stdin, or pass the filename on jq's command line.

Doing everything described above in a manner robust against input data containing JSON-encoded NULs would look like the following:

#!/bin/bash
display() {
  echo "name is $1 and id is $2"
}

cat >test.json <<'EOF'
[
  { "name": "Akshay", "id": "234" },
  { "name": "Amit", "id": "28" }
]
EOF

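# Read NUL-delimited fields; -d '' tells each read to stop at a NUL byte instead of a newline.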
while IFS= read -r -d '' name && IFS= read -r -d '' id; do
  display "$name" "$id"
done < <(jq -j '
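  # Defend against NULs embedded in the data, which would otherwise be mistaken for delimiters.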
  def stripnuls: sub("\u0000"; "<NUL>");
  .[] | ((.name | stripnuls), "\u0000", (.id | stripnuls), "\u0000")
' <test.json)
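
Running this against the sample test.json prints:

name is Akshay and id is 234
name is Amit and id is 28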

You can see the above running at https://replit.com/@CharlesDuffy2/BelovedForestgreenUnits#main.sh

Upvotes: 1
