shanks_roux

Reputation: 438

Sqoop - Is it possible to import flat files into HDFS?

I know that it is possible to import RDBMS data into HDFS via Sqoop, but I would like to know whether it is possible to import flat files as well.

For example, is it possible to import a file from a remote Linux filesystem?

Thanks for your help.

Upvotes: 2

Views: 8161

Answers (3)

Glenn

Reputation: 1

Sqoop cannot be used to import arbitrary file types into Hadoop; it is designed for structured, relational data. Depending on your requirements for timeliness of data ingestion into Hadoop (batch, near real-time, real-time), you can choose between hadoop fs -put (good for macro batches) and Flume or Kafka (good for more frequent updates, such as near real-time use cases). For real-time ingestion, you may need to consider in-memory processing first and permanent storage second; in that case, look at tools like Storm or Spark Streaming.

Upvotes: 0

Rajashekar Reddy Peta

Reputation: 133

The answer is no: Sqoop cannot import flat text files. Use Flume to import them instead.
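
As a rough sketch of the Flume approach (none of this is from the answer; the agent name `a1`, the spool directory, and the HDFS URL are all placeholders you would adapt to your cluster), a minimal agent configuration that picks up any file dropped into a local directory and writes it into HDFS could look like this:

```properties
# flume-hdfs.conf - hypothetical example agent, adjust names and paths
a1.sources = src1
a1.channels = ch1
a1.sinks = sink1

# Spooling-directory source: Flume ingests any file placed in spoolDir
a1.sources.src1.type = spooldir
a1.sources.src1.spoolDir = /var/flume/incoming
a1.sources.src1.channels = ch1

# In-memory channel buffers events between source and sink
a1.channels.ch1.type = memory
a1.channels.ch1.capacity = 10000

# HDFS sink writes the events into the target directory
a1.sinks.sink1.type = hdfs
a1.sinks.sink1.channel = ch1
a1.sinks.sink1.hdfs.path = hdfs://namenode:8020/data/incoming
a1.sinks.sink1.hdfs.fileType = DataStream
```

The agent would then be started with something like `flume-ng agent --name a1 --conf-file flume-hdfs.conf`.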

Upvotes: 0

Praveen Sripati

Reputation: 33495

Sqoop is not required for putting flat files into HDFS, and I don't see any reason to use it for this. Just try the below command. Here is the documentation for the same.

hadoop fs -put <src-linux-file-system> <target-hdfs-file-system>
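
To make the command above concrete (the paths here are hypothetical), and to cover the remote-Linux-filesystem case from the question, note that -put can also read from stdin when given "-" as the source, so a remote file can be streamed over SSH without an intermediate copy:

```shell
# Copy a local flat file into HDFS (paths are examples)
hadoop fs -put /tmp/data.csv /user/hadoop/data.csv

# Verify that the file arrived
hadoop fs -ls /user/hadoop

# For a file on a remote Linux host, stream it over SSH and
# let -put read from stdin ("-" means read standard input)
ssh user@remote-host cat /path/to/data.csv | hadoop fs -put - /user/hadoop/data.csv
```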

Upvotes: 1
