Reputation: 301
So I am just starting out with Hive. Here is what I do:
-> Load the file into HDFS:
hadoop fs -put purchases.txt
-> Create a table:
> CREATE EXTERNAL TABLE p1(transaction STRING) STORED AS TEXTFILE
> LOCATION '/purchases.txt';
or
CREATE TABLE p1(transaction STRING) STORED AS TEXTFILE LOCATION '/purchases.txt';
-> Show the table:
show tables;
At this point it shows me that the table p1 has been created.
-> View the contents:
select * from p1;
It just outputs
OK
Time taken: 0.175 seconds
EDIT:
The data is stored in this format:
date '\t' time '\t' store '\t' item '\t' cost
I would like to read each whole line as a single string, hence I've specified only one string column.
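(If I need individual fields later, I am assuming I can pull them out of the single string with split(), e.g. something like:)
select split(transaction, '\t')[3] as item from p1;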
Upvotes: 0
Views: 1898
Reputation: 31
Use this command instead; it should work:
CREATE EXTERNAL TABLE p1(transaction STRING)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ',' LINES TERMINATED BY '\n'
STORED AS TEXTFILE LOCATION '/p1';
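If you later want the five tab-separated fields as separate columns rather than one long string, a definition along these lines should work (a sketch only; the table name, column names and types are just examples):
CREATE EXTERNAL TABLE p2(
  purchase_date STRING,
  purchase_time STRING,
  store STRING,
  item STRING,
  cost FLOAT)
ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t' LINES TERMINATED BY '\n'
STORED AS TEXTFILE LOCATION '/p1';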
Upvotes: 0
Reputation: 18434
The location of a Hive table should be specified as a directory, not an individual file. Hive will then read every file in that directory. For example:
Create the directory:
hadoop fs -mkdir /p1
Put the file in the directory:
hadoop fs -put purchases.txt /p1
Create the Hive table:
CREATE EXTERNAL TABLE p1(transaction STRING)
STORED AS TEXTFILE
LOCATION '/p1';
Most tools in the Hadoop world tend to operate on directories rather than individual files. That way, Hadoop itself can manage how many files are read and written and how they are named.
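For example, once the table points at the directory, any additional file you drop into it shows up in the table automatically (the second file name below is just an example):
hadoop fs -put more_purchases.txt /p1
Then in Hive you can check the combined contents:
SELECT * FROM p1 LIMIT 10;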
Upvotes: 2
Reputation: 955
Try using the following query; you will then be able to see the data populated in your table:
create external table p1 (
transaction String
)
location '/purchases.txt';
NOTE: There are several other ways to create a table and load data into it; I have just given a solution for the approach you took. Try exploring them; one alternative is sketched below.
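For instance, you could create a managed table (no LOCATION clause) and move the HDFS file into it with LOAD DATA. This is just a sketch, assuming your file is still at /purchases.txt; the table name is only an example:
-- Hive manages the table directory itself here
CREATE TABLE p1_managed(transaction STRING) STORED AS TEXTFILE;
-- moves the file from /purchases.txt into the table's directory
LOAD DATA INPATH '/purchases.txt' INTO TABLE p1_managed;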
Let me know if you have any questions.
Happy Hadooping!!!!!
Upvotes: 0