user1219626

How to access my HDFS filesystem from another machine?

I am running a program that creates an HDFS directory and puts a file into it. In the Java program I am using a Configuration like this:

Configuration conf = new Configuration();
conf.set("fs.default.name","hdfs://localhost:9000");
conf.set("mapred.job.tracker","localhost:8021");

Now a colleague on another machine wants to copy the file that is in my HDFS. I am sure that for this he has to connect to my HDFS. So how can my colleague connect to my HDFS and copy the file from it?

My colleague is using the code below to access my HDFS:

Configuration conf = new Configuration();
conf.set("fs.default.name","hdfs://192.168.1.239:9000");
conf.set("mapred.job.tracker","192.168.1.239:8021");

but it is not working and gives the following error:

14/11/03 16:17:22 INFO ipc.Client: Retrying connect to server: 192.168.1.239/192.168.1.239:9000. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1 SECONDS)
14/11/03 16:17:23 INFO ipc.Client: Retrying connect to server: 192.168.1.239/192.168.1.239:9000. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1 SECONDS)
14/11/03 16:17:24 INFO ipc.Client: Retrying connect to server: 192.168.1.239/192.168.1.239:9000. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1 SECONDS)
14/11/03 16:17:25 INFO ipc.Client: Retrying connect to server: 192.168.1.239/192.168.1.239:9000. Already tried 3 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1 SECONDS)
14/11/03 16:17:26 INFO ipc.Client: Retrying connect to server: 192.168.1.239/192.168.1.239:9000. Already tried 4 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1 SECONDS)
14/11/03 16:17:27 INFO ipc.Client: Retrying connect to server: 192.168.1.239/192.168.1.239:9000. Already tried 5 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1 SECONDS)
14/11/03 16:17:28 INFO ipc.Client: Retrying connect to server: 192.168.1.239/192.168.1.239:9000. Already tried 6 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1 SECONDS)
14/11/03 16:17:29 INFO ipc.Client: Retrying connect to server: 192.168.1.239/192.168.1.239:9000. Already tried 7 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1 SECONDS)
14/11/03 16:17:30 INFO ipc.Client: Retrying connect to server: 192.168.1.239/192.168.1.239:9000. Already tried 8 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1 SECONDS)
14/11/03 16:17:31 INFO ipc.Client: Retrying connect to server: 192.168.1.239/192.168.1.239:9000. Already tried 9 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1 SECONDS)
Exception in thread "main" java.net.ConnectException: Call to 192.168.1.239/192.168.1.239:9000 failed on connection exception: java.net.ConnectException: Connection refused
    at org.apache.hadoop.ipc.Client.wrapException(Client.java:1142)
    at org.apache.hadoop.ipc.Client.call(Client.java:1118)
    at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:229)
    at com.sun.proxy.$Proxy1.getProtocolVersion(Unknown Source)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:85)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:62)
    at com.sun.proxy.$Proxy1.getProtocolVersion(Unknown Source)
    at org.apache.hadoop.ipc.RPC.checkVersion(RPC.java:422)
    at org.apache.hadoop.hdfs.DFSClient.createNamenode(DFSClient.java:183)
    at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:281)
    at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:245)
    at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:100)
    at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1446)
    at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:67)
    at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1464)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:263)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:124)
    at com.volcareTest.VolcareTest.VolcareApp.main(VolcareApp.java:27)
Caused by: java.net.ConnectException: Connection refused
    at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
    at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:739)
    at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
    at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:511)
    at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:481)
    at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:457)
    at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:583)
    at org.apache.hadoop.ipc.Client$Connection.access$2200(Client.java:205)
    at org.apache.hadoop.ipc.Client.getConnection(Client.java:1249)
    at org.apache.hadoop.ipc.Client.call(Client.java:1093)
    ... 20 more

If my colleague's approach is wrong, what is the correct approach?

Upvotes: 0

Views: 3036

Answers (2)

user1219626

I resolved my problem. In my core-site.xml configuration file I changed the fs.default.name property from hdfs://localhost:9000 to hdfs://192.168.1.239:9000, and made the same change in my Java code; now it works.
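
For reference, the relevant core-site.xml property would look roughly like this after the change (the IP and port are the ones from the question; the rest of the file stays as it is for your installation):

<configuration>
  <property>
    <!-- Point the default filesystem at the machine's LAN address instead of localhost -->
    <name>fs.default.name</name>
    <value>hdfs://192.168.1.239:9000</value>
  </property>
</configuration>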

Upvotes: 0

Mr.Chowdary

Reputation: 3407

If both machines are on the same network, then the following should work:

Configuration conf = new Configuration();
conf.set("fs.default.name","hdfs://192.168.1.239:9000");
conf.set("mapred.job.tracker","192.168.1.239:8021");  

If the machines are on different networks but both are connected to the internet, you can find the public IP address of the machine you want to connect to with an IP address finder and use that address instead.

Hope it helps you.
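
Once the connection goes through, a minimal sketch of how your colleague could copy a file out of the remote HDFS with the standard FileSystem API might look like this (the HDFS source path and local destination path are placeholders):

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class CopyFromRemoteHdfs {
    public static void main(String[] args) throws Exception {
        // Point the client at the remote NameNode (address from the question).
        Configuration conf = new Configuration();
        conf.set("fs.default.name", "hdfs://192.168.1.239:9000");

        // Open the remote HDFS and copy one file to the local filesystem.
        FileSystem fs = FileSystem.get(conf);
        fs.copyToLocalFile(new Path("/user/hduser/somefile.txt"),  // HDFS source (placeholder)
                           new Path("/tmp/somefile.txt"));         // local destination (placeholder)
        fs.close();
    }
}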

Upvotes: 1
