leeyuiwah

Reputation: 7152

How to express a hex literal in Spark SQL?

I am new to Spark SQL. I have searched the Hive/Spark SQL language manuals and googled for an answer, but could not find an obvious one.

In MySQL we can express a hex literal 0xffff like this:

mysql>select 0+0xffff;
+----------+
| 0+0xffff |
+----------+
|    65535 |
+----------+
1 row in set (0.00 sec)

But in Spark SQL (I am using the beeline client), I could only do the following, with the numerical values expressed in decimal, not hexadecimal.

> select 0+65535;
+--------------+--+
| (0 + 65535)  |
+--------------+--+
| 65535        |
+--------------+--+
1 row selected (0.047 seconds)

If I did the following instead, I would get an error:

> select 0+0xffff;
Error: org.apache.spark.sql.AnalysisException: 
cannot resolve '`0xffff`' given input columns: []; line 1 pos 9;
'Project [unresolvedalias((0 + '0xffff), None)]
+- OneRowRelation$ (state=,code=0)

How do we express a hex literal in Spark SQL?

Upvotes: 4

Views: 1396

Answers (1)

zavyrylin

Reputation: 342

Unfortunately, you can't do it in Spark SQL.

You can verify this by looking at the ANTLR grammar file. There, the number rule is defined via the DIGIT lexer rule, which looks like this:

number
  : MINUS? DECIMAL_VALUE            #decimalLiteral
  | MINUS? INTEGER_VALUE            #integerLiteral
  | MINUS? BIGINT_LITERAL           #bigIntLiteral
  | MINUS? SMALLINT_LITERAL         #smallIntLiteral
  | MINUS? TINYINT_LITERAL          #tinyIntLiteral
  | MINUS? DOUBLE_LITERAL           #doubleLiteral
  | MINUS? BIGDECIMAL_LITERAL       #bigDecimalLiteral
  ;

...

INTEGER_VALUE
  : DIGIT+
  ;

...

fragment DIGIT
  : [0-9]
  ;

None of these rules admit hexadecimal digits, so hex literals like 0xffff cannot be parsed.
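As a practical workaround (not part of the original answer, just a sketch): Spark SQL does ship a conv(num, from_base, to_base) built-in, so something like select conv('ffff', 16, 10) should give you the decimal value (as a string, which you may need to cast). Alternatively, if you build the query from a client program, you can convert the hex value there and splice the decimal form into the SQL text, e.g. in Python:

```python
# Workaround sketch: the grammar rejects hex literals, so convert the
# value client-side and embed the decimal form in the query string.
hex_text = "0xffff"
decimal_value = int(hex_text, 16)      # Python parses the hex string -> 65535
query = f"SELECT 0 + {decimal_value}"  # build the SQL with a decimal literal
print(query)                           # SELECT 0 + 65535
```

The resulting query uses only decimal literals, which the grammar above accepts.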

Upvotes: 5
