m81

Reputation: 2327

Write data from Hadoop MapReduce job into MySQL

I've been parsing log files using MapReduce, but it always outputs a text file named "part-00000" to store my results, and I then have to import part-00000 into MySQL manually.

Is there an easy way to store MapReduce results directly in MySQL? For example, how might I store the results of the classic "Word Count" MapReduce program in MySQL directly?

I'm using Hadoop 1.2.1 and the mapred libraries (i.e. org.apache.hadoop.mapred.*, not org.apache.hadoop.mapreduce.*; the two are not compatible as far as I'm aware). I don't have access to Sqoop.

Upvotes: 2

Views: 2247

Answers (1)

NPKR

Reputation: 5506

By using DBOutputFormat, you can write MapReduce output directly to a database.

Here is an example to go through.
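Below is a minimal sketch of the classic word count written against the old mapred API (Hadoop 1.x), with DBOutputFormat as the output format. The table name wordcount, its columns word and count, and the JDBC URL/credentials are placeholders for illustration; adjust them for your database, and create the table before running the job. The reduce output key implements DBWritable, so each collected key becomes one INSERT row; the value is ignored.

    import java.io.*;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;
    import java.sql.SQLException;
    import java.util.Iterator;
    import java.util.StringTokenizer;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.*;
    import org.apache.hadoop.mapred.*;
    import org.apache.hadoop.mapred.lib.db.*;

    public class WordCountToMySQL {

        // One row in the (hypothetical) wordcount(word, count) table.
        public static class WordCountRecord implements Writable, DBWritable {
            private String word;
            private int count;

            public WordCountRecord() {}
            public WordCountRecord(String word, int count) { this.word = word; this.count = count; }

            // Writable: Hadoop's serialization of the record
            public void write(DataOutput out) throws IOException {
                Text.writeString(out, word);
                out.writeInt(count);
            }
            public void readFields(DataInput in) throws IOException {
                word = Text.readString(in);
                count = in.readInt();
            }

            // DBWritable: how fields bind to the generated INSERT's columns
            public void write(PreparedStatement stmt) throws SQLException {
                stmt.setString(1, word);
                stmt.setInt(2, count);
            }
            public void readFields(ResultSet rs) throws SQLException {
                word = rs.getString(1);
                count = rs.getInt(2);
            }
        }

        public static class TokenizerMapper extends MapReduceBase
                implements Mapper<LongWritable, Text, Text, IntWritable> {
            private final static IntWritable ONE = new IntWritable(1);
            private final Text word = new Text();

            public void map(LongWritable key, Text value,
                            OutputCollector<Text, IntWritable> out, Reporter reporter) throws IOException {
                StringTokenizer itr = new StringTokenizer(value.toString());
                while (itr.hasMoreTokens()) {
                    word.set(itr.nextToken());
                    out.collect(word, ONE);
                }
            }
        }

        public static class SumReducer extends MapReduceBase
                implements Reducer<Text, IntWritable, WordCountRecord, NullWritable> {
            public void reduce(Text key, Iterator<IntWritable> values,
                               OutputCollector<WordCountRecord, NullWritable> out, Reporter reporter)
                    throws IOException {
                int sum = 0;
                while (values.hasNext()) sum += values.next().get();
                // The key (a DBWritable) becomes one row; the NullWritable value is ignored.
                out.collect(new WordCountRecord(key.toString(), sum), NullWritable.get());
            }
        }

        public static void main(String[] args) throws IOException {
            JobConf conf = new JobConf(WordCountToMySQL.class);
            conf.setJobName("wordcount-to-mysql");

            // JDBC driver and connection details -- placeholders, adjust for your setup.
            DBConfiguration.configureDB(conf, "com.mysql.jdbc.Driver",
                    "jdbc:mysql://localhost:3306/mydb", "user", "password");
            // Target table must already exist, e.g.:
            // CREATE TABLE wordcount (word VARCHAR(255), count INT);
            DBOutputFormat.setOutput(conf, "wordcount", "word", "count");

            conf.setMapperClass(TokenizerMapper.class);
            conf.setReducerClass(SumReducer.class);
            conf.setMapOutputKeyClass(Text.class);
            conf.setMapOutputValueClass(IntWritable.class);
            conf.setOutputKeyClass(WordCountRecord.class);
            conf.setOutputValueClass(NullWritable.class);

            conf.setInputFormat(TextInputFormat.class);
            conf.setOutputFormat(DBOutputFormat.class);
            FileInputFormat.setInputPaths(conf, new Path(args[0]));

            JobClient.runJob(conf);
        }
    }

Make sure the MySQL JDBC driver jar is on the job's classpath (e.g. via -libjars or DistributedCache), since the reducers open the JDBC connection at runtime.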

Personally, I suggest Sqoop for data imports (from DB to HDFS) and exports (from HDFS to DB).

Upvotes: 1
