Abhishek Anand

Reputation: 181

External Table not getting updated from parquet files written by spark streaming

I am using Spark Streaming to write aggregated output as parquet files to HDFS using SaveMode.Append. I have an external table created like:

CREATE TABLE if not exists rolluptable
USING org.apache.spark.sql.parquet
OPTIONS (
  path "hdfs:////"
);

I was under the impression that with an external table the queries should also fetch data from the newly added parquet files. However, the newly written files do not seem to be picked up.

Dropping and recreating the table every time works fine, but that is not a real solution.

Please suggest how my table can also pick up the data from the newer files.
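For reference, the write side looks roughly like this. This is a minimal sketch, assuming Spark 1.x with a DStream of aggregated Rows; aggregatedStream, schema, and the output path are hypothetical placeholders, not my exact code:

    import org.apache.spark.sql.SaveMode

    aggregatedStream.foreachRDD { rdd =>
      // Convert each micro-batch of aggregated Rows to a DataFrame
      val df = sqlContext.createDataFrame(rdd, schema)
      // Append the batch as new parquet files under the table's path
      df.write.mode(SaveMode.Append).parquet("hdfs:///path/to/rolluptable")
    }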

Upvotes: 4

Views: 6633

Answers (1)

lev

Reputation: 4127

Are you reading those tables with Spark? If so, Spark caches Parquet table metadata (since schema discovery can be expensive).

To overcome this, you have 2 options:

  1. Set the config spark.sql.parquet.cacheMetadata to false
  2. Refresh the table before the query: sqlContext.refreshTable("my_table")

See here for more details: http://spark.apache.org/docs/latest/sql-programming-guide.html#hive-metastore-parquet-table-conversion
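A minimal sketch of both options, assuming Spark 1.x with an existing sqlContext and the table name from your question:

    // Option 1: disable parquet metadata caching globally
    sqlContext.setConf("spark.sql.parquet.cacheMetadata", "false")

    // Option 2: refresh the cached metadata for this table before querying it
    sqlContext.refreshTable("rolluptable")
    sqlContext.sql("SELECT COUNT(*) FROM rolluptable").show()

Option 2 is usually preferable, since it only invalidates the cached metadata for the one table you need instead of turning caching off everywhere.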

Upvotes: 7
