Reputation: 2092
I am having problems with my json to csv conversion.
I have json file of this structure:
{
"key": [
{
"key1": 1,
"key2": 1,
"key3": {
"1": 1,
"2": 2,
"3": 3,
"4": 4
}
},
{
"key1": 2,
"key2": 2,
"key3": {
"2": 2
}
}
...
],
...
}
I was using this jq call to convert my json to csv:
bin\jq-win64 -r ".key[] | [.key1, .key2, .key3.\"1\", .key3.\"2\", .key3.\"3\", .key3.\"4\"] | map(tostring) | join(\";\")" source.json > output.tmp
I can't use the standard @csv, because it doesn't suit my locale settings. But back to the problem: in some cases key3 is not a full 4-element object (4 is the maximum, with keys 1/2/3/4, as in the example). Those missing subkeys are a problem, because jq returns "null" for them, and that does not work well with CSV evaluation in Excel or Calc. Is there a way to force an empty string or a numeric 0 as output in such cases?
As a last resort I could use some other command-line text processor, but I'd be glad if I could do this with a single tool.
EDIT: I had the wrong JSON structure in the example, and the actual problem has changed a bit. The message is updated.
Upvotes: 0
Views: 3767
Reputation: 117027
You could tweak your pipeline by adding
map(. // 0)
right after forming the array. If you want to preserve false, then you would have to use
map(if . == null then 0 else . end)
instead.
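As a cross-check, here is a rough Python sketch of that substitution (the sample data and the to_row helper are hypothetical, mirroring the question's structure). Note that it behaves like the map(if . == null then 0 else . end) variant: only null/None becomes 0, and false is preserved, whereas jq's // would also replace false.

```python
import json

# Hypothetical sample mirroring the question's structure.
doc = json.loads("""
{"key": [
  {"key1": 1, "key2": 1, "key3": {"1": 1, "2": 2, "3": 3, "4": 4}},
  {"key1": 2, "key2": 2, "key3": {"2": 2}}
]}
""")

def to_row(item):
    # Collect the six columns; dict.get returns None for missing
    # subkeys, which plays the role of jq's null.
    cells = [item["key1"], item["key2"]] + [item["key3"].get(k) for k in "1234"]
    # Substitute 0 for None before stringifying, then join with ";".
    return ";".join(str(c if c is not None else 0) for c in cells)

rows = [to_row(item) for item in doc["key"]]
# rows[1] is "2;2;0;2;0;0" for the second, incomplete record
```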
If you wanted a solution that was agnostic about the key names, you could use something along the lines of:
def resize($n): [range(0;$n) as $i | .[$i] // 0];
This would truncate or pad the input array to length $n. If you never want to truncate, then tweak accordingly.
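A rough Python equivalent of that resize filter might look as follows (the function name is kept from the jq definition; everything else is an assumption). In jq, .[$i] is null both for an out-of-range index and for an explicit null element, so both become 0 here; unlike jq's //, this sketch leaves false values untouched.

```python
def resize(arr, n):
    # jq: def resize($n): [range(0;$n) as $i | .[$i] // 0];
    # Out-of-range indices and explicit None entries both map to 0.
    return [arr[i] if i < len(arr) and arr[i] is not None else 0
            for i in range(n)]

# resize([1, None, 3], 5) pads to [1, 0, 3, 0, 0]
# resize([1, 2, 3, 4], 2) truncates to [1, 2]
```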
Upvotes: 4