ketankk

Reputation: 2674

How to import MySQL data to Hadoop file system?

In my system I have a database in MySQL. I want to import it into the Hadoop file system. I found something about Sqoop, but I can't figure out the command to do that.

Upvotes: 0

Views: 1213

Answers (4)

ketankk

Reputation: 2674

There are multiple ways to achieve this:

  1. The old way is to use Sqoop.

  2. Another way is to use a shell script (a minimal sketch follows this list):

    a. Connect to MySQL (mysql -h<host> -u<username> -p<password>)

    b. Open a connection to HDFS (hadoop fs)

    c. Run a SELECT on the table and put the output into HDFS

  3. The recommended way is to use Apache NiFi:

    a. Use the ExecuteSQL and PutHDFS processors
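A minimal shell-script sketch for option 2; the host, credentials, database, table, and paths below are placeholders, not values from the question:

#!/bin/bash
# Dump the MySQL table as tab-separated text (-B) without a header row (-N),
# then copy the file into HDFS. All values below are example placeholders.
MYSQL_HOST="mysql-host"
MYSQL_USER="root"
MYSQL_PASS="secret"
DB_NAME="testDb"
TABLE="student"

mysql -h"$MYSQL_HOST" -u"$MYSQL_USER" -p"$MYSQL_PASS" -B -N \
  -e "SELECT * FROM ${TABLE};" "$DB_NAME" > /tmp/${TABLE}.tsv

# Create the target directory in HDFS and push the dump file into it
hadoop fs -mkdir -p /user/$(whoami)/${TABLE}
hadoop fs -put -f /tmp/${TABLE}.tsv /user/$(whoami)/${TABLE}/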

Upvotes: 0

Ravindra babu

Reputation: 38950

1) Install and configure MySQL first, then create a database in MySQL.

2) The following command will import the data: sqoop import --connect jdbc:mysql://localhost/databasename --username $USER_NAME --password $PASSWORD --table tablename -m 1

e.g.

sqoop import --connect jdbc:mysql://localhost/testDb --username root --password hadoop123 --table student -m 1

In the above command, the parameter values are database: ‘testDb’, username: ‘root’, password: ‘hadoop123’, and table: ‘student’.
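By default, Sqoop writes the imported files under the current user's HDFS home directory, in a subdirectory named after the table. Assuming that default and the example table above, you can check the result with:

hadoop fs -ls /user/$(whoami)/student
hadoop fs -cat /user/$(whoami)/student/part-m-00000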

Have a look at article 1 and article 2 for a better step-by-step understanding.

Upvotes: 0

Akash Garg

Reputation: 56

sqoop import --connect jdbc:mysql://mysql-server-name/db_name --username user --password password --table table_name --target-dir target_directory_name -m 1
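For reference, a filled-in version of the same command; the host, database, credentials, and target directory are placeholders, not values from the question:

sqoop import --connect jdbc:mysql://mysql-host/testDb --username root --password hadoop123 --table student --target-dir /user/hadoop/student_import -m 1

--target-dir lets you choose the HDFS output directory instead of Sqoop's default of /user/<username>/<table>.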

Hope it helps.

Upvotes: 2

Durga Viswanath Gadiraju

Reputation: 3966

You need to install the MySQL JDBC/Java connector and then run the Sqoop command.

sudo yum install mysql-connector-java
ln -s /usr/share/java/mysql-connector-java.jar /var/lib/sqoop/mysql-connector-java.jar
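Depending on how Sqoop was installed, it may look for JDBC drivers in a different lib directory; the $SQOOP_HOME path below is an assumption for a tarball install, so adjust the symlink target to match your setup:

ln -s /usr/share/java/mysql-connector-java.jar $SQOOP_HOME/lib/mysql-connector-java.jar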

After that, you can run the sqoop command as described in Save data into mysql from hive hadoop through sqoop?

Upvotes: 0
