user1733735

Reputation: 453

Unable to read parquet file locally in spark

I am running PySpark locally and trying to read a parquet file and load it into a DataFrame from a notebook.

df = spark.read.parquet("metastore_db/tmp/userdata1.parquet")

I am getting this exception:

An error occurred while calling o738.parquet.
: org.apache.spark.sql.AnalysisException: java.lang.RuntimeException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient;

Does anyone know how to fix this?

Upvotes: 0

Views: 3306

Answers (1)

Vijay Krishna

Reputation: 1067

Assuming that you are running Spark locally, you should be doing something like:

df = spark.read.parquet("file:///metastore_db/tmp/userdata1.parquet")
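For context, here is a minimal end-to-end sketch of a local read. The conversion of the relative path to an absolute `file://` URI and the session settings are assumptions, not part of the original answer; adjust the path to wherever userdata1.parquet actually lives on your machine.

    import os
    from pyspark.sql import SparkSession

    # Build (or reuse) a plain local SparkSession. Reading parquet directly
    # from disk does not require Hive support.
    spark = (
        SparkSession.builder
        .master("local[*]")
        .appName("read-parquet-locally")
        .getOrCreate()
    )

    # Use an absolute path with the file:// scheme so Spark resolves it against
    # the local filesystem rather than a default (e.g. HDFS) filesystem.
    # The relative path here is hypothetical; point it at your actual file.
    path = "file://" + os.path.abspath("metastore_db/tmp/userdata1.parquet")

    df = spark.read.parquet(path)
    df.show(5)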

Upvotes: 1
