Arvinth

Reputation: 70

Between statement is not working on Hive Map column - Spark SQL

I have the following Hive table. The key column has a map value (key-value pairs). I am executing a Spark SQL query with a BETWEEN statement on the key column, but it is returning no records.

+---------------+--------------+----------------------+---------+
| column_value  | metric_name  |         key          |key[0]   |
+---------------+--------------+----------------------+---------+
| A37B          | Mean         | {0:"202009",1:"12"}  | 202009  |
| ACCOUNT_ID    | Mean         | {0:"202009",1:"12"}  | 202009  |
| ANB_200       | Mean         | {0:"202009",1:"12"}  | 202009  |
| ANB_201       | Mean         | {0:"202009",1:"12"}  | 202009  |
| AS82_RE       | Mean         | {0:"202009",1:"12"}  | 202009  |
| ATTR001       | Mean         | {0:"202009",1:"12"}  | 202009  |
| ATTR001_RE    | Mean         | {0:"202009",1:"12"}  | 202009  |
| ATTR002       | Mean         | {0:"202009",1:"12"}  | 202009  |
| ATTR002_RE    | Mean         | {0:"202009",1:"12"}  | 202009  |
| ATTR003       | Mean         | {0:"202009",1:"12"}  | 202009  |
| ATTR004       | Mean         | {0:"202009",1:"12"}  | 202009  |
| ATTR005       | Mean         | {0:"202009",1:"12"}  | 202009  |
| ATTR006       | Mean         | {0:"202009",1:"12"}  | 202008  |
+---------------+--------------+----------------------+---------+

I am running the below Spark SQL query:

SELECT column_value, metric_name, key
FROM table
WHERE metric_name = 'Mean'
  AND column_value IN ('ATTR003','ATTR004','ATTR005')
  AND key[0] BETWEEN 202009 AND 202003

The query is not returning any records. If I use IN (202009,202007,202008,202006,202005,202004,202003) instead of the BETWEEN statement, it returns results.

Need help!

Upvotes: 0

Views: 47

Answers (1)

Ged

Reputation: 18053

Try the BETWEEN values the other way around, e.g. between 202003 and 202009.
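BETWEEN expects the lower bound first: col BETWEEN low AND high is equivalent to col >= low AND col <= high, so BETWEEN 202009 AND 202003 can never match any row. As a sketch, reusing the table and column names from the question, the corrected query would look like:

SELECT column_value, metric_name, key
FROM table
WHERE metric_name = 'Mean'
  AND column_value IN ('ATTR003','ATTR004','ATTR005')
  AND key[0] BETWEEN 202003 AND 202009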

Upvotes: 2
