Ishan Kumar

Reputation: 1982

Getting null values in Hive table while loading data in Spark SQL

While loading data from a file into a Hive table, null values are getting inserted.

sqlCon.sql("create table hive_6(id Int,name String) partitioned by (date String) row format delimited fields terminated by ','");

sqlCon.sql("load data local inpath '/home/cloudera/file.txt' into table hive_6 partition(date='19July')");

sqlCon.sql("select * from hive_6").show()

+----+----+------+
|  id|name|  date|
+----+----+------+
|null|null|19July|
|null|null|19July|
|null|null|19July|
|null|null|19July|
|null|null|19July|
|null|null|19July|
|null|null|19July|
+----+----+------+

Upvotes: 2

Views: 5072

Answers (1)

Nandita Dwivedi

Reputation: 83

I was facing the same issue when reading data from Parquet files.

Hive queries will return the correct data, but Spark SQL will show null values. The reason is the schema; you should check the following (see the sketch after this list):

Firstly, the column names in the file (txt/Parquet) you are reading should all be lowercase.

Secondly, the column names in the Hive table you have created should be exactly the same as those in the file you are reading.

Thirdly, the datatypes in the txt/Parquet file and the Hive table should be the same.
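
As a minimal sketch of how to compare the two schemas, you could do something like the following in Scala. It reuses the question's sqlCon and file path; it assumes Spark 2.x, where DataFrameReader has a csv method (on older versions you would need the spark-csv package), and the explicit file schema here is illustrative.

import org.apache.spark.sql.types.{IntegerType, StringType, StructField, StructType}

// 1. Look at the schema Spark thinks the Hive table has
//    (column names and datatypes must match the file exactly).
sqlCon.sql("describe hive_6").show(false)

// 2. Read the delimited file with an explicit, lowercase schema
//    that mirrors the table definition from the question.
val fileSchema = StructType(Seq(
  StructField("id", IntegerType),
  StructField("name", StringType)
))

val fileDf = sqlCon.read
  .schema(fileSchema)
  .option("sep", ",")
  .csv("/home/cloudera/file.txt")

// If this already shows nulls, the file's contents do not match the
// declared columns/types; if it shows real values, the mismatch is
// between the file and the Hive table definition.
fileDf.show()
fileDf.printSchema()

Any difference in column names (including case) or datatypes between the two outputs is what turns the values into nulls when Spark SQL reads the table.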

Upvotes: 3
