Naveen Srikanth

Reputation: 789

PySpark writing data into Hive

Below is my code to write data into Hive

from pyspark import SparkContext
from pyspark.sql import SparkSession, SQLContext
from pyspark.sql import HiveContext as hc
from pyspark.sql.functions import isnan
from pyspark.sql.types import *


spark = SparkSession.builder \
    .appName("example-spark") \
    .config("spark.sql.crossJoin.enabled", "true") \
    .config("spark.sql.warehouse.dir", "file:///C:/spark-2.0.0-bin-hadoop2.7/bin/metastore_db/spark-warehouse") \
    .config("spark.rpc.message.maxSize", "1536") \
    .getOrCreate()

Name = spark.read.csv("file:///D:/valid.csv", header="true", inferSchema=True, sep=',')

join_df=join_df.where("LastName != ''").show()  
join_df.registerTempTable("test")
hc.sql("CREATE TABLE dev_party_tgt_repl STORED AS PARQUETFILE AS SELECT * from dev_party_tgt")

After executing the above code, I get the error below:

Traceback (most recent call last):
  File "D:\01 Delivery Support\01 easyJet\SparkEclipseWorkspace\SparkTestPrograms\src\NameValidation.py", line 22, in <module>
    join_df.registerTempTable("test")
AttributeError: 'NoneType' object has no attribute 'registerTempTable'
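The error comes from the chained `.show()`: `DataFrame.show()` prints rows and returns `None`, so the assignment leaves `join_df` as `None`. A minimal Spark-free illustration of the pitfall (`FakeDF` is a stand-in class, not a real pyspark type):

```python
class FakeDF:
    """Stand-in for a DataFrame: where() returns a frame, show() returns None."""

    def where(self, condition):
        return self  # real where() returns a new DataFrame

    def show(self):
        print("+--------+")  # show() only prints; it has no return value


# Mirrors the question's pattern: the chain ends in show(), so the
# assignment captures None, not the filtered frame.
join_df = FakeDF().where("LastName != ''").show()
print(join_df)  # prints: None
```

Dropping `.show()` from the assignment (or calling it on a separate line) keeps `join_df` bound to the DataFrame.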

My System Environment details:

Upvotes: 0

Views: 6110

Answers (1)

Piotr Kalański

Reputation: 689

Try this:

join_df.where("LastName != ''").write.saveAsTable("dev_party_tgt_repl")
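A slightly fuller sketch of the same idea, assuming the table should land in Hive (the session needs `enableHiveSupport()` for that; paths and the table name are taken from the question, and `mode("overwrite")` is an assumption to make reruns idempotent):

```python
from pyspark.sql import SparkSession

# Hive support makes saveAsTable() create a real Hive table instead of
# writing only to Spark's local warehouse directory.
spark = SparkSession.builder \
    .appName("example-spark") \
    .enableHiveSupport() \
    .getOrCreate()

join_df = spark.read.csv("file:///D:/valid.csv", header=True, inferSchema=True)

# Filter, then write. Note there is no .show() in the chain:
# show() returns None and would break the write.
join_df.where("LastName != ''") \
    .write.mode("overwrite") \
    .format("parquet") \
    .saveAsTable("dev_party_tgt_repl")
```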

Upvotes: 1
