user3822232

Reputation: 315

Flink SQL: how to find the size of a JSON array

I am using Flink SQL to read data from Kafka. One field in the Kafka message is an array, for example:

{
  "description": "som description",
  "owner": {
    "type": "some",
    "id": "5ff4eb4fed9b4b1288d7993944a8ca23"
  },
  "someArray": [
    {
      "type": "foo",
      "id": "c31a2d10134146e29726fb87246b68d0"
    },
    {
      "type": "foo1",
      "id": "c31a2d10134146e29726fb87246b68d0"
    }
  ]
}

I want to write a select statement similar to: select description, size_of(someArray) from some_table;

Flink does NOT have a size_of function. Can I get the length of someArray (which is 2 in this example) using some built-in function?

I have tried to write a UDF for this. The challenge I have with the UDF is that when the query is executed through the sql-gateway, I get a ClassNotFoundException on the UDF class (it is a Java class). When I try the same with the sql-client CLI, the UDF works. I have added the jar that contains the UDF to /usr/local/Cellar/apache-flink/1.16.0/libexec/lib when running it on my machine.
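For reference, a minimal sketch of how the UDF is registered and called from SQL once its jar is on the classpath (the function name and the fully qualified class name here are placeholders, not the real ones):

-- register the Java UDF under a SQL function name (class name is a placeholder)
CREATE TEMPORARY FUNCTION size_of AS 'com.example.udf.SizeOf';

-- the query I would like to run
SELECT description, size_of(someArray) FROM some_table;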

Upvotes: 1

Views: 749

Answers (1)

xjmdoo

Reputation: 1736

You can use the built-in system function CARDINALITY to get the length of an array like so:

select cardinality(someArray) as array_length...;

For more information about collection functions in the Table API, please check the docs.
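As a sketch of how this could look against the message from the question, assuming a Kafka-backed table using the JSON format (the table name, topic, and connector options below are assumptions):

CREATE TABLE some_table (
  description STRING,
  `owner` ROW<`type` STRING, id STRING>,
  someArray ARRAY<ROW<`type` STRING, id STRING>>
) WITH (
  'connector' = 'kafka',
  'topic' = 'some_topic',                            -- assumed topic name
  'properties.bootstrap.servers' = 'localhost:9092', -- assumed broker address
  'format' = 'json'
);

SELECT description, CARDINALITY(someArray) AS array_length
FROM some_table;
-- array_length is 2 for the sample message in the question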

Upvotes: 1
