Greg Ward

Reputation: 1673

Use jq to concatenate JSON arrays in multiple files

I have a series of JSON files containing an array of records, e.g.

$ cat f1.json
{
  "records": [
    {"a": 1},
    {"a": 3}
  ]
}

$ cat f2.json
{
  "records": [
    {"a": 2}
  ]
}

I want to 1) extract a single field from each record and 2) output a single array containing all the field values from all input files.

The first part is easy:

$ jq '.records | map(.a)' f?.json
[
  1,
  3
]
[
  2
]

But I cannot figure out how to get jq to concatenate those output arrays into a single array!

I'm not married to jq; I'll happily use another tool if necessary. But I would love to know how to do this with jq, because it's something I have been trying to figure out for years.

Upvotes: 18

Views: 17706

Answers (6)

The --slurp (-s) option, combined with map(), does this in one shot:

$ cat f1.json
{
  "records": [
    {"a": 1},
    {"a": 3}
  ]
}

$ cat f2.json
{
  "records": [
    {"a": 2}
  ]
}

$ jq -s 'map(.records[].a)' f?.json
[
  1,
  3,
  2
]
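
To see what -s hands to map, slurping on its own (compact output via -c for brevity) should show both documents wrapped in a single top-level array, whose two elements map then visits:

$ jq -s -c '.' f?.json
[{"records":[{"a":1},{"a":3}]},{"records":[{"a":2}]}]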

Upvotes: 3

peak

Reputation: 116750

Assuming your jq has inputs (which is true of jq 1.5 and later), it would be most efficient to use it, e.g. along the lines of:

jq -n '[inputs.records[].a]' f*.json
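
With the two sample files from the question, this should collect every .a value into a single array in one pass:

$ jq -n '[inputs.records[].a]' f*.json
[
  1,
  3,
  2
]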

Upvotes: 13

chepner

Reputation: 531205

As a compromise between the readability of --slurp and the efficiency of reduce, you can run jq twice. The first pass is a slightly altered version of your original command; the second slurps the resulting undifferentiated stream into a single array.

$ jq '.records[] | .a' f?.json | jq -s .
[
  1,
  3,
  2
]
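
For comparison, the first pass on its own emits a flat stream of values; the trailing jq -s . then gathers that stream into one array:

$ jq '.records[] | .a' f?.json
1
3
2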

Upvotes: 2

Inian

Reputation: 85600

If your input files are large, slurping them could eat up a lot of memory, in which case you can use reduce, which works iteratively, appending the .a values one object at a time:

jq -n 'reduce inputs.records[].a as $d (.; . += [$d])' f?.json

The -n flag ensures the output JSON is built from scratch from the data read via inputs. reduce starts with . as its initial value, which, because of the null input, is just null. Then, for each value produced by inputs.records[].a, . += [$d] appends it, building up a single array.
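
The null starting point works because jq treats addition with null as returning the other operand unchanged; a quick check:

$ jq -n 'null + [1]'
[
  1
]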

Upvotes: 6

oliv

Reputation: 13249

Use -s (or --slurp):

jq -s 'map(.records[].a)' f?.json

Upvotes: 14

Aaron

Reputation: 24812

You need to use --slurp so that jq applies its filter to the aggregation of all inputs rather than to each input individually. With this option, jq's input becomes an array of the inputs, which your filter needs to account for.

I would use the following :

jq --slurp 'map(.records | map(.a)) | add' f?.json

We apply your current transformation to each element of the slurped array of inputs (your previous individual inputs), then merge those transformed arrays into one with add.
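
To see what add is merging, the same filter without the final add should yield one inner array per input file; add then concatenates them into the flat [1, 3, 2]:

$ jq --slurp 'map(.records | map(.a))' f?.json
[
  [
    1,
    3
  ],
  [
    2
  ]
]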

Upvotes: 4
