Girish Bhat M

Reputation: 392

Spark SQL query giving data type mismatch error

I have a small SQL query that works perfectly fine in Hive, but the same query does not work in Spark SQL as expected. The table has user information, and below is the query:

spark.sql("select * from users where (id,id_proof) not in ((1232,345))").show;

I am getting below exception in spark

org.apache.spark.sql.AnalysisException: cannot resolve '(named_struct('age', deleted_inventory.`age`, 'id_proof', deleted_inventory.`id_proof`) IN (named_struct('col1',1232, 'col2', 345)))' due to data type mismatch: Arguments must be same type but were: StructType(StructField(id,IntegerType,true), StructField(id_proof,IntegerType,true)) != StructType(StructField(col1,IntegerType,false), StructField(col2,IntegerType,false));

Both id and id_proof are of integer types.

Upvotes: 0

Views: 1149

Answers (1)

stack0114106

Reputation: 8711

Try building the exclusion pair with a WITH clause (CTE) and filtering with a NOT IN subquery; it works. With a subquery, Spark plans the NOT IN as an anti join instead of comparing struct literals, so the field-name mismatch (col1/col2 vs id/id_proof) no longer trips the analyzer.

scala> val df = Seq((101,121), (1232,345),(222,2242)).toDF("id","id_proof")
df: org.apache.spark.sql.DataFrame = [id: int, id_proof: int]

scala> df.show(false)
+----+--------+
|id  |id_proof|
+----+--------+
|101 |121     |
|1232|345     |
|222 |2242    |
+----+--------+


scala> df.createOrReplaceTempView("girish")

scala> spark.sql("with t1( select 1232 id,345 id_proof ) select id, id_proof from girish where (id,id_proof) not in (select id,id_proof from t1) ").show(false)
+---+--------+
|id |id_proof|
+---+--------+
|101|121     |
|222|2242    |
+---+--------+


scala>
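Alternatively, a minimal sketch of the same exclusion using the DataFrame API with a left_anti join (assuming the same df as above and running in spark-shell, where spark.implicits are already imported):

scala> // hypothetical "exclude" DataFrame holding the (id, id_proof) pairs to drop
scala> val exclude = Seq((1232, 345)).toDF("id", "id_proof")

scala> // keep only rows of df that have no matching (id, id_proof) pair in exclude
scala> df.join(exclude, Seq("id", "id_proof"), "left_anti").show(false)

The anti join expresses "not in this set of pairs" directly, so it sidesteps the tuple NOT IN syntax that triggers the struct type mismatch.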

Upvotes: 1
