Reputation: 875
I have sent logs from Kubernetes to an S3 bucket and want to query them using Athena.
The log looks like this:
[{ "date":1589895855.077230,
"log":"192.168.85.35 - - [19/May/2020:13:44:15 +0000] \"GET /healthz HTTP/1.1\" 200 3284 \"-\" \"ELB-HealthChecker/2.0\" \"-\"",
"stream":"stdout",
"time":"2020-05-19T13:44:15.077230187Z",
"kubernetes":{
"pod_name":"myapp-deployment-cd984ffb-kjfbm",
"namespace_name":"master",
"pod_id":"eace0175-99cd-11ea-95e4-0aee746ae5d6",
"labels":{
"app":"myapp",
"pod-template-hash":"cd984ffb"
},
"annotations":{
"cluster-autoscaler.kubernetes.io/safe-to-evict":"false",
"kubernetes.io/psp":"eks.privileged"
},
"host":"ip-1-1-1-1.eu-north-1.compute.internal",
"container_name":"myapp",
"docker_id":"cb2cda1ed46c5f09d15090fc3f654b1de35970001e366923287cfbd4a4abf4a1"
}
},
{ "date":1589995860.077230,
"log":"192.168.1.40 - - [20/May/2020:17:31:00 +0000] \"GET /healthz HTTP/1.1\" 200 3284 \"-\" \"ELB-HealthChecker/2.0\" \"-\"",
"stream":"stdout",
"time":"2020-05-20T17:31:00.077230187Z",
"kubernetes":{
"pod_name":"myapp-deployment-cd984ffb-kjfbm",
"namespace_name":"master",
"pod_id":"eace0175-99cd-11ea-95e4-0aee746ae5d6",
"labels":{
"app":"myapp",
"pod-template-hash":"cd984ffb"
},
"annotations":{
"cluster-autoscaler.kubernetes.io/safe-to-evict":"false",
"kubernetes.io/psp":"eks.privileged"
},
"host":"ip-1-1-1-1.eu-north-1.compute.internal",
"container_name":"myapp",
"docker_id":"cb2cda1ed46c5f09d15090fc3f654b1de35970001e366923287cfbd4a4abf4a1"
}
},]
So basically it is an array with JSON objects in it.
I am using a CREATE EXTERNAL TABLE query in Athena to create the table. What I have tried is:
CREATE EXTERNAL TABLE IF NOT EXISTS athenadb.mytable (
`data` string
)
ROW FORMAT SERDE 'org.openx.data.jsonserde.JsonSerDe'
LOCATION 's3://mybucket/testlog'
TBLPROPERTIES ('has_encrypted_data'='false');
This only reads the first item in the array into the table, unless I specify more columns, such as
data1 string
data2 string
data3 string
However, since I don't know how many items are in the array, I need something more dynamic.
Then I tried this:
CREATE EXTERNAL TABLE IF NOT EXISTS athenadb.mytable (
`data` string
)
LOCATION 's3://mybucket/testlog'
TBLPROPERTIES ('has_encrypted_data'='false');
Now I get the entire log (both entries) in one row in the table.
From here I have tried to use UNNEST, but I get the error "cannot unnest type: varchar".
What would be the simplest way to get each {} into its own row in the table? Could it perhaps be done from the CREATE EXTERNAL TABLE statement, without needing any extra queries afterwards?
Edit:
I have now tried this as well:
SELECT data
FROM mytable
CROSS JOIN UNNEST(CAST(json_parse(data) AS array)) AS data2
But I get "Unknown type: array"
I found a similar question here: How do I import an array of data into separate rows in a hive table?
But none of the suggested solutions there produced the result I wanted.
Upvotes: 5
Views: 18810
Reputation: 20710
Combine UNNEST with casting json to array(json):
SELECT data, e
FROM mytable
CROSS JOIN UNNEST(CAST(json_parse(data) AS array(json))) t(e)
Note: array<json>
is a legacy version of array(json)
type definition. The latter is SQL standard compliant.
Upvotes: 8