amikm

Reputation: 65

How to fix a SQL query in Databricks if a column name has brackets in it

I have a file which has data like this, and I have converted that file into a Databricks table.

Select * from myTable

Output:

Product[key] Product[name]
123          Mobile
345          television
456          laptop

I want to query my table for laptop data.

I am using the query below:

Select * from myTable where Product[name]='laptop'

I am getting the below error in Databricks:

AnalysisException: cannot resolve 'Product' given input columns: [spark_catalog.my_db.myTable.Product[key], spark_catalog.my_db.myTable.Product[name]]

Upvotes: 0

Views: 6806

Answers (1)

Saideep Arikontham

Reputation: 6114

When certain characters appear in the column names of a SQL table, you get a parse exception. These characters include brackets, dots (.), hyphens (-), etc. So, when such characters appear in a column name, you need an escape mechanism so that they are parsed simply as part of the column name.

For SQL in Databricks, this character is the backtick (`). Enclosing your column name in backticks ensures that the name is parsed as-is, even when it includes characters like '[]' (as in this case).
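For example, applied to the question's myTable, the failing query works once the bracketed column name is wrapped in backticks (a minimal sketch):

 -- Backticks make Product[name] a single identifier instead of an
 -- element access on a column called Product
 select * from myTable where `Product[name]` = 'laptop';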

Since you converted a file's data into a Databricks table, you were not able to see the main problem, which is parsing the column name. If you manually create a table with that schema in Databricks, you will get the following result:

[screenshot of the resulting parse error]

Once you use backticks in the following way, using the column name is no longer a problem.

 create table mytable(`Product[key]` integer, `Product[name]` varchar(20));
 select * from mytable where `Product[name]`='laptop';
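For completeness, a sketch of loading the sample rows from the question so that the query above can be verified end to end:

 -- Insert the sample rows shown in the question's output
 insert into mytable values (123, 'Mobile'), (345, 'television'), (456, 'laptop');
 -- The select above now returns only the row: 456, laptop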

Upvotes: 1
