Reputation: 1167
I'm trying to write a Hive query in Scala/Spark, which looks like this:
val myQuery = "create table myTable(col1 STRING, col2 STRING) ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t' location 'path/from/to' as select * myHiveTable"
I'm getting an error from my use of '\t'
value unary_+ is not a member of String
What's the proper way to handle this character in Scala?
Upvotes: 1
Views: 1030
Reputation: 6627
1) You should escape the backslash in \t (i.e. write \\t), because you want to pass this query on to Hive. If you don't escape it, Scala puts a real tab character into the string, and Hive will not understand it (see the sketch after this list).
2) You didn't provide enough detail, in particular the code where the error occurred; you only show the assignment of the Hive query string to a variable.
3) Where is the FROM in your SQL example?
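A minimal sketch of point 1, assuming a Hive-enabled SparkSession; the session setup is illustrative, while the table names and location are taken from the question:

import org.apache.spark.sql.SparkSession

// A Hive-enabled SparkSession; the builder options here are placeholders.
val spark = SparkSession.builder()
  .appName("hive-query-example")
  .enableHiveSupport()
  .getOrCreate()

// Escaping the backslash (\\t) keeps the two characters '\' and 't' in the
// Scala string, so Hive receives the literal \t delimiter specification
// instead of a real tab character.
val myQuery =
  "create table myTable(col1 STRING, col2 STRING) " +
  "ROW FORMAT DELIMITED FIELDS TERMINATED BY '\\t' " +
  "location 'path/from/to' as select * from myHiveTable"

spark.sql(myQuery)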
Upvotes: 0
Reputation: 16412
If you use single double quotes ("), escaped characters are interpreted according to their meaning: \t becomes a tab and \n becomes a newline. Example:
scala> val myQuery = "create table myTable(col1 STRING, col2 STRING) ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t' location 'path/from/to' as select * myHiveTable"
myQuery: String = create table myTable(col1 STRING, col2 STRING) ROW FORMAT DELIMITED FIELDS TERMINATED BY ' ' location 'path/from/to' as select * myHiveTable
If you use triple double quotes ("""), Scala leaves the string as it is. Example:
scala> val myQuery = """create table myTable(col1 STRING, col2 STRING) ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t' location 'path/from/to' as select * myHiveTable"""
myQuery: String = create table myTable(col1 STRING, col2 STRING) ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t' location 'path/from/to' as select * myHiveTable
I believe Spark expects to see \t as a text value (two characters) rather than as an actual tab character (U+0009).
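A quick way to check which form you ended up with; the variable names below are just for illustration:

// Triple quotes: the backslash and 't' survive as two separate characters.
val literalForm = """FIELDS TERMINATED BY '\t'"""
println(literalForm.contains("\\t"))     // true  - contains the literal backslash + 't'
println(literalForm.contains("\t"))      // false - no real tab character

// Single double quotes: \t is interpreted as a real tab (U+0009).
val interpretedForm = "FIELDS TERMINATED BY '\t'"
println(interpretedForm.contains("\t"))  // true  - contains a real tab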
Upvotes: 1