Reputation: 6059
I create a hiveContext and, from a JDBC relational table, a DataFrame df. I register the DataFrame via df.registerTempTable("TESTTABLE") and start the Thrift server with HiveThriftServer2.startWithContext(hiveContext); a sketch of this setup follows the sample rows below. The TESTTABLE contains 1,000,000 entries, and its columns are ID (INT) and NAME (VARCHAR):
+-----+--------+
| ID | NAME |
+-----+--------+
| 1 | Hello |
| 2 | Hello |
| 3 | Hello |
| ... | ... |
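For reference, a minimal sketch of the setup described above might look as follows (Spark 1.x API, since registerTempTable and HiveContext are used). The source table name test is taken from the query log shown further down; the JDBC URL and driver are assumptions for illustration only.

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext
import org.apache.spark.sql.hive.thriftserver.HiveThriftServer2

object ThriftPushdownSetup {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("thrift-pushdown"))
    val hiveContext = new HiveContext(sc)

    // Load the relational table over JDBC; URL and driver are placeholders.
    val df = hiveContext.read.format("jdbc").options(Map(
      "url"     -> "jdbc:postgresql://dbhost:5432/mydb",  // assumption
      "driver"  -> "org.postgresql.Driver",               // assumption
      "dbtable" -> "test"                                 // source table name seen in the query log
    )).load()

    // Make the DataFrame visible to the Thrift server under the name TESTTABLE.
    df.registerTempTable("TESTTABLE")

    // Start the HiveServer2-compatible SQL endpoint (default port 10000).
    HiveThriftServer2.startWithContext(hiveContext)
  }
}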
With Beeline I connect to the SQL endpoint of the Thrift server (on port 10000) and run a query, e.g.
SELECT * FROM TESTTABLE WHERE ID='3'
When I inspect the database's query log to see the SQL statements that were actually executed, I find
/*SQL #:1000000 t:657*/ SELECT \"ID\",\"NAME\" FROM test;
So no predicate pushdown happens, as the WHERE clause is missing.
This gives rise to the following question: why is the WHERE clause not pushed down to the database when the query arrives through the Thrift server? By contrast, if I create a DataFrame df in the Spark SQLContext and call
df.filter( df("ID") === 3).show()
I observe
/*SQL #:1*/SELECT \"ID\",\"NAME\" FROM test WHERE ID = 3;
as expected.
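Aside from the database's query log, the pushdown can also be checked from the Spark side (not part of the original post, just a quick sketch): the physical plan for a JDBC-backed DataFrame lists the predicates that were pushed to the source.

// Print the physical plan; for a JDBC scan the pushed predicate shows up
// in the scan node, e.g. as PushedFilters: [EqualTo(ID,3)]
// (the exact rendering differs between Spark versions).
df.filter(df("ID") === 3).explain()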
Upvotes: 2
Views: 2141
Reputation: 51
It's probably too late to answer, but in this scenario the pushdown did not work because ID is defined as INT while the original query passes a string ('3'). Predicate pushdown requires the column type to match as well as the column name, so a query like SELECT * FROM TESTTABLE WHERE ID = 3 should allow the filter to be pushed down.
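A quick way to verify this reasoning, assuming the same hiveContext and registered TESTTABLE as in the question, is to run both literal forms and compare what shows up in the database's query log; per this answer, only the type-matching form should arrive with the WHERE clause attached.

// Int literal against the INT column: the filter should be pushed down,
// so the database log should show ... WHERE ID = 3.
hiveContext.sql("SELECT * FROM TESTTABLE WHERE ID = 3").show()

// String literal against the INT column (as in the original Beeline query):
// per this answer, the type mismatch prevents the pushdown and the database
// receives an unfiltered SELECT of the whole table.
hiveContext.sql("SELECT * FROM TESTTABLE WHERE ID = '3'").show()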
Upvotes: 1