Reputation: 1680
How do I load a text file that is on my local machine into a remote HBase table? I referred to the command below, but I am really confused by it:
hadoop jar <path to hbase jar> importtsv -Dimporttsv.columns=a,b,c '-Dimporttsv.separator=,' <tablename> <inputdir>
Where do I specify the path of the text file, and where do the table name and columns go when they are in the text file? My text file contains the create and put statements below; how do I load and execute that file in the HBase shell? Please clear up my confusion if anyone knows.
Script File:
create 'blogpostss', 'post', 'image'
Run the following in the HBase shell to add some data:
put 'blogpostss', 'post1', 'post:title', 'Hello World'
put 'blogpostss', 'post1', 'post:author', 'The Author'
put 'blogpostss', 'post1', 'post:body', 'This is a blog post'
put 'blogpostss', 'post1', 'image:header', 'image1.jpg'
put 'blogpostss', 'post1', 'image:bodyimage', 'image2.jpg'
put 'blogpostss', 'post2', 'post:title', 'Another Post'
put 'blogpostss', 'post2', 'post:title', 'My Second Post'
put 'blogpostss', 'post1', 'post:body', 'This is an updated blog postss'
The following commands retrieve data:
get 'blogpostss', 'post1'
get 'blogpostss', 'post1', { COLUMN => 'post:title' }
get 'blogpostss', 'post1', { COLUMN => 'post:title', VERSIONS => 4 }
get 'blogpostss', 'post1', { COLUMNS => 'post:body', VERSIONS => 3 }
get 'blogpostss', 'post2'
get 'blogpostss', 'post2', { COLUMN => 'post:title' }
get 'blogpostss', 'post2', { COLUMN => 'post:title', VERSIONS => 4 }
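One caveat about the VERSIONS => 4 reads above: they only return multiple versions if the column family actually keeps that many. In recent HBase releases the default is a single version per cell, so a sketch of a create statement that retains four versions of the 'post' family would look like this (same table and family names as above):

```
create 'blogpostss', {NAME => 'post', VERSIONS => 4}, 'image'
```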
Upvotes: 0
Views: 4925
Reputation: 670
You can try this:
1) SSH into your HBase machine.
2) Copy the file to the path below:
/home/hbase/hbase-0.98.3-hadoop2/bin (this can change depending on where your HBase folder is)
3) [root@hostname bin]# ./hbase shell ./sample_commands.txt
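A minimal sketch of the steps above, assuming the script file is named sample_commands.txt as in the answer (the hbase shell invocation itself is commented out because it needs a running HBase installation):

```shell
# Write the HBase shell commands into a plain text file, one per line.
cat > sample_commands.txt <<'EOF'
create 'blogpostss', 'post', 'image'
put 'blogpostss', 'post1', 'post:title', 'Hello World'
get 'blogpostss', 'post1'
exit
EOF

# Feed the whole file to the HBase shell (requires a live HBase):
# ./hbase shell ./sample_commands.txt
```

The trailing exit makes the shell quit after running the script instead of dropping to an interactive prompt.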
Upvotes: 0
Reputation: 757
1. Open a terminal and SSH into the machine where HBase is configured.
2. Copy your local text file to HDFS.
3. Create a table with the desired column family using the HBase shell.
4. Now execute your import command: hadoop jar <path to hbase jar> importtsv -Dimporttsv.columns=a,b,c '-Dimporttsv.separator=,' <tablename> <inputdir>
NOTE:
path to hbase jar -> the path where the HBase jar is available.
tablename -> the name of the table you just created.
inputdir -> the fully qualified HDFS path, including the file extension.
-Dimporttsv.columns=a,b,c -> the column family:qualifier mappings must be given here; one of them must be HBASE_ROW_KEY to mark which field is the row key.
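A hedged end-to-end sketch of these steps, using the question's blogpostss table; the file name blogposts.csv, the HDFS directory /user/hbase/input, and the jar path are illustrative assumptions, and the cluster commands are commented out because they need a live Hadoop/HBase setup:

```shell
# 1) Build a small CSV locally: row key first, then two qualifiers
#    in the 'post' column family.
printf 'post3,Third Post,The Author\n'   >  blogposts.csv
printf 'post4,Fourth Post,Someone Else\n' >> blogposts.csv

# 2) Copy it to HDFS and run importtsv. Note HBASE_ROW_KEY in the
#    columns list: it tells importtsv which field is the row key.
# hdfs dfs -mkdir -p /user/hbase/input
# hdfs dfs -put blogposts.csv /user/hbase/input/
# hadoop jar /path/to/hbase-server.jar importtsv \
#   -Dimporttsv.columns=HBASE_ROW_KEY,post:title,post:author \
#   '-Dimporttsv.separator=,' \
#   blogpostss /user/hbase/input
```

After the job finishes, a get 'blogpostss', 'post3' in the HBase shell should show the imported title and author cells.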
Upvotes: 1