user2841795

Reputation: 405

Convert string type to array type in spark sql

I have a table in Spark SQL on Databricks with a string column. I converted it into a new column with Array datatype, but the values are still a single string, even though the datatype shows as array in the table schema.

Column as String

Data1 
[2461][2639][2639][7700][7700][3953]

Converted to Array

Data_New
["[2461][2639][2639][7700][7700][3953]"]

String to array conversion

df_new = df.withColumn("Data_New", array(df["Data1"])) 

Then I write it as Parquet and use it as a Spark SQL table in Databricks.

When I search for a string using the array_contains function, I get false:

select *
from table_name
where array_contains(Data_New,"[2461]")

When I search for the whole string, the query returns true.

Please suggest how I can split this string into an array so that I can find any element using the array_contains function.

Upvotes: 2

Views: 8900

Answers (3)

E.ZY.

Reputation: 725

Import:

from pyspark.sql import functions as sf, types as st

Create a test DataFrame:

a = [["[2461][2639][2639][7700][7700][3953]"], [None]]
sdf = sc.parallelize(a).toDF(["col1"])
sdf.show()
+--------------------+
|                col1|
+--------------------+
|[2461][2639][2639...|
|                null|
+--------------------+

Convert the type with a UDF:

def spliter(x):
    # strip the outer brackets, then split on the "][" separators
    if x is not None:
        return x[1:-1].split("][")
    return None

# name the UDF distinctly so it does not shadow pyspark.sql.functions.udf
split_udf = sf.udf(spliter, st.ArrayType(st.StringType()))
sdf.withColumn("array_col1", split_udf("col1")) \
   .withColumn("check", sf.array_contains("array_col1", "2461")) \
   .show()
+--------------------+--------------------+-----+
|                col1|          array_col1|check|
+--------------------+--------------------+-----+
|[2461][2639][2639...|[2461, 2639, 2639...| true|
|                null|                null| null|
+--------------------+--------------------+-----+
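The slicing-and-split logic inside the UDF can be sanity-checked in plain Python, without a Spark session; a minimal sketch using the sample string from the question:

```python
def spliter(x):
    # drop the outer "[" and "]", then split on the "][" separators
    if x is not None:
        return x[1:-1].split("][")
    return None

print(spliter("[2461][2639][2639][7700][7700][3953]"))
# ['2461', '2639', '2639', '7700', '7700', '3953']
print(spliter(None))
# None
```

Note the elements come back without brackets, which is why the lookup key becomes "2461" rather than "[2461]".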

Upvotes: -1

blackbishop

Reputation: 32640

Just remove leading and trailing brackets from the string then split by ][ to get an array of strings:

from pyspark.sql.functions import split, expr

df = df.withColumn("Data_New", split(expr("rtrim(']', ltrim('[', Data1))"), "\\]\\["))
df.show(truncate=False)

+------------------------------------+------------------------------------+
|Data1                               |Data_New                            |
+------------------------------------+------------------------------------+
|[2461][2639][2639][7700][7700][3953]|[2461, 2639, 2639, 7700, 7700, 3953]|
+------------------------------------+------------------------------------+

Now use array_contains like this:

df.createOrReplaceTempView("table_name")

sql_query = "select * from table_name where array_contains(Data_New,'2461')"
spark.sql(sql_query).show(truncate=False)
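As a quick check outside Spark, the same trim-and-split can be mirrored with Python's re module (this only illustrates the pattern; it is not Spark code):

```python
import re

s = "[2461][2639][2639][7700][7700][3953]"
# drop the leading "[" and trailing "]", then split on "][";
# the brackets in the pattern are escaped because "[" and "]"
# are regex metacharacters
parts = re.split(r"\]\[", s.strip("[]"))
print(parts)
# ['2461', '2639', '2639', '7700', '7700', '3953']
print("2461" in parts)
# True
```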

Upvotes: 2

LaSul

Reputation: 2411

Actually, this is not an array but a full string, so you need a regex or similar. Note that the brackets must be escaped, since [ and ] are regex metacharacters:

pattern = "\\[2461\\]"
df_new.filter(df_new["Data_New"].rlike(pattern))
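The escaping matters: in a regex, [2461] is a character class matching any single one of the digits 2, 4, 6, or 1, not the literal text "[2461]". A plain-Python illustration:

```python
import re

s = "[2461][2639][2639][7700][7700][3953]"
# unescaped: "[2461]" is a character class -> matches single digits 2, 4, 6, 1
print(bool(re.search("[2461]", "9999")))       # False: contains none of 2, 4, 6, 1
print(bool(re.search("[2461]", "1234")))       # True, but for the wrong reason
# escaped: matches the literal substring "[2461]"
print(bool(re.search(r"\[2461\]", s)))         # True
print(bool(re.search(r"\[2461\]", "[9999]")))  # False
```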

Upvotes: 0
