xirururu

Reputation: 5508

Debugging Hadoop remotely in Eclipse

I am new to the Hadoop world. I have a remote Hadoop cluster, and I want to write Java code at home in Eclipse and run it on that cluster.

I found some similar topics with different answers, so I don't know which one applies. My conclusion is that it should take two steps:

  1. Configure Eclipse: Debug Configurations -> Remote Java Application -> New -> enter the "Host" and "Port"
  2. Ask the Hadoop administrator to do the following for me (a per-job alternative is sketched just after this list):
    • export HADOOP_OPTS="-agentlib:jdwp=transport=dt_socket,server=y,suspend=y,address=5000"
    • restart Hadoop
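
From what I have read, a per-job variant of step 2 may also be possible without changing HADOOP_OPTS cluster-wide: the same JDWP agent string can be passed to the task JVMs through the job configuration. This is only a rough sketch of what I mean; the property name mapred.child.java.opts is the classic (MRv1) one, and newer releases split it into mapreduce.map.java.opts / mapreduce.reduce.java.opts, so it would have to be adjusted to whatever the cluster actually runs.

    import org.apache.hadoop.conf.Configuration;

    public class TaskDebugOpts {
        public static void main(String[] args) {
            Configuration conf = new Configuration();

            // Illustrative only: pass the JDWP agent string to the task JVMs
            // of a single job instead of changing HADOOP_OPTS cluster-wide.
            // mapred.child.java.opts is the classic (MRv1) property; newer
            // releases use mapreduce.map.java.opts / mapreduce.reduce.java.opts.
            // With suspend=y every task waits for a debugger on port 5000, so
            // this is only practical for a small test job.
            conf.set("mapred.child.java.opts",
                    "-agentlib:jdwp=transport=dt_socket,server=y,suspend=y,address=5000");

            // ... build and submit the job with this Configuration, then attach
            // Eclipse (Debug Configurations -> Remote Java Application) to the
            // node that runs the task.
        }
    }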

Is it really that easy, or are there additional steps I need to take?

Upvotes: 0

Views: 552

Answers (1)

Andrew Allison

Reputation: 1136

It depends on what kind of job you would like to run (MapReduce or HDFS). You could try adding the fully qualified path to the Hadoop cluster's configuration files to the classpath of your application. The bundled core-default.xml is read first, followed by core-site.xml, and then any additional configuration resources you need. See the Hadoop Javadoc for the Configuration class for the details: http://hadoop.apache.org/docs/current/api/org/apache/hadoop/conf/Configuration.html. There is also the option of running the application remotely via SSH.
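
As a rough illustration of that advice, here is a minimal Java sketch. It assumes you have copied the cluster's core-site.xml and hdfs-site.xml to a local directory (the paths below are made up); Configuration.addResource is the programmatic equivalent of putting that directory on the classpath, and listing the HDFS root is just a quick way to confirm that the remote configuration was actually picked up.

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileStatus;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class RemoteClusterCheck {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();

            // Hypothetical local copies of the cluster's configuration files.
            // They are added after the bundled core-default.xml that
            // Configuration loads automatically, so they override the defaults.
            conf.addResource(new Path("/home/me/hadoop-conf/core-site.xml"));
            conf.addResource(new Path("/home/me/hadoop-conf/hdfs-site.xml"));

            // If the files were picked up, fs.defaultFS points at the remote
            // NameNode; otherwise it falls back to the local file system.
            System.out.println("fs.defaultFS = " + conf.get("fs.defaultFS"));

            FileSystem fs = FileSystem.get(conf);
            for (FileStatus status : fs.listStatus(new Path("/"))) {
                System.out.println(status.getPath());
            }
        }
    }

If fs.defaultFS still prints the local default (file:///), the configuration files were not found.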

Upvotes: 1
