Reputation: 165
I am facing the error below when trying to access one table, while some other tables work properly. What could be the reason?
Query:
SELECT * FROM icbs_cup00901_d LIMIT 1
The same query works for other tables, so I am confused about what the reason could be. The table mentioned in the query has 8 columns, all with string data type.
Error: Your query has the following error(s):
SYNTAX_ERROR: line 1:8: SELECT * not allowed from relation that has no columns
This query ran against the "bpd_ds_db" database, unless qualified by the query. Please post the error message on our forum or contact customer support with Query Id: a462d56e-952b-4a6f-a275-b0c9db0301b0.
I have checked that the table has 8 columns, all of string data type, and the exact same query works for other tables, so I am confused about what the issue might be. The table was created with the following DDL:
CREATE EXTERNAL TABLE icbs_cup00901_d(
cux1ac_acct_surr_id string,
cux1ap string,
cux1cs_cust_surr_id string,
cux1ty string,
cuxbk string,
cuxnot string,
cuxrec string,
cuxrel string
)
ROW FORMAT SERDE 'org.apache.hadoop.hive.ql.io.parquet.serde.ParquetHiveSerDe'
STORED AS
  INPUTFORMAT 'org.apache.hadoop.hive.ql.io.parquet.MapredParquetInputFormat'
  OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.parquet.MapredParquetOutputFormat'
LOCATION 's3://bpd-datalake/00Raw/datasets/parquet/DWH/ICBS_CUP00901_D'
TBLPROPERTIES (
  'lastUpdateDate'='2021-09-30 03:02:04.780351',
  'last_modified_by'='hadoop',
  'last_modified_time'='1632970985',
  'parquet.compress'='GZIP',
  'transient_lastDdlTime'='1632970985'
)
Upvotes: 9
Views: 19743
Reputation: 328
In my case I used the DynamoDB connector. The tables were encrypted with a customer managed encryption key, and queries returned the same error message.
So I just needed to extend the connector Lambda's permissions with kms:Decrypt to be able to query the tables.
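A minimal sketch of that permission change with boto3; the role name and key ARN below are placeholders for illustration, not the connector's actual names:

import json
import boto3

iam = boto3.client("iam")

# Placeholders: substitute your connector Lambda's execution role
# and the ARN of the customer managed key that encrypts the tables.
ROLE_NAME = "athena-dynamodb-connector-role"
KEY_ARN = "arn:aws:kms:eu-west-1:123456789012:key/your-key-id"

# Attach an inline policy that lets the connector decrypt with that key.
iam.put_role_policy(
    RoleName=ROLE_NAME,
    PolicyName="AllowKmsDecryptForConnector",
    PolicyDocument=json.dumps({
        "Version": "2012-10-17",
        "Statement": [{
            "Effect": "Allow",
            "Action": "kms:Decrypt",
            "Resource": KEY_ARN,
        }],
    }),
)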
Upvotes: 0
Reputation: 2080
In my use case, it was not enough to add "Describe" & "Select" under Data Lake Permissions; I also had to go to Administration > Data lake locations and register the appropriate S3 URI.
Note that if you just register a bucket name, you can still access a database that uses that bucket name plus a folder path.
Tip: select "ALL TABLES" to force AWS Lake Formation to explicitly add the policy. [![enter image description here][1]][1]
Also, deregistering an S3 resource may not clean up the service-linked role called "AWSServiceRoleForLakeFormationDataAccess".

  [1]: https://i.sstatic.net/itfK5.png
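The same registration can also be done programmatically; here is a minimal boto3 sketch, assuming the service-linked role is used and an ARN that mirrors the LOCATION from the question's DDL (adjust to your actual bucket/prefix):

import boto3

lf = boto3.client("lakeformation")

# Register the S3 location that backs the table so Lake Formation can vend
# credentials for it. The ARN below is only an example.
lf.register_resource(
    ResourceArn="arn:aws:s3:::bpd-datalake/00Raw/datasets/parquet/DWH",
    UseServiceLinkedRole=True,  # uses AWSServiceRoleForLakeFormationDataAccess
)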
Upvotes: 0
Reputation: 11
For this issue, you need to grant SELECT access in Lake Formation. Select the role you are using and grant it at least SELECT access on the table.
Upvotes: 1
Reputation: 165
Thanks, everyone, for sharing the information! In my case I am not sure whether it was related to permissions or not, but the issue was with the '*': when I replaced it with the required column names, the query worked properly.
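For reference, a sketch of the working query shape, using the column names from the DDL in the question and submitted via boto3 (the result output location is a placeholder):

import boto3

athena = boto3.client("athena")

# Explicit column list instead of SELECT * (columns taken from the CREATE TABLE).
sql = """
SELECT cux1ac_acct_surr_id, cux1ap, cux1cs_cust_surr_id, cux1ty,
       cuxbk, cuxnot, cuxrec, cuxrel
FROM icbs_cup00901_d
LIMIT 1
"""

athena.start_query_execution(
    QueryString=sql,
    QueryExecutionContext={"Database": "bpd_ds_db"},
    ResultConfiguration={"OutputLocation": "s3://your-athena-results-bucket/"},  # placeholder
)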
Upvotes: 1
Reputation: 376
I had the same issue and found out it was because my user did not have sufficient permissions. The error is not very helpful, but I suppose Athena cannot see the columns, so it gives that error.
Resolve it by adding the permissions in AWS Lake Formation. Under Data Lake Permissions, make sure your user has at least the 'Describe' and 'Select' permissions on the database and the table.
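If you prefer to script those two grants, here is a minimal boto3 sketch using the database and table from the question, with a placeholder IAM role ARN as the principal:

import boto3

lf = boto3.client("lakeformation")

# Placeholder principal: the IAM role (or user) that runs the Athena query.
PRINCIPAL = {"DataLakePrincipalIdentifier": "arn:aws:iam::123456789012:role/athena-query-role"}

# DESCRIBE on the database so the catalog metadata is visible.
lf.grant_permissions(
    Principal=PRINCIPAL,
    Resource={"Database": {"Name": "bpd_ds_db"}},
    Permissions=["DESCRIBE"],
)

# SELECT on the table so its columns (and data) are readable.
lf.grant_permissions(
    Principal=PRINCIPAL,
    Resource={"Table": {"DatabaseName": "bpd_ds_db", "Name": "icbs_cup00901_d"}},
    Permissions=["SELECT"],
)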
Upvotes: 19