xemjas

Reputation: 135

Accessing HDFS Remotely

I have a Hadoop server running on a certain machine, say at IP 192.168.11.7, with its core-site.xml as follows:

<configuration>
    <property>
        <name>fs.defaultFS</name>
        <value>hdfs://localhost:9000</value>
    </property>
</configuration>

I have already started HDFS with the command:

sbin/start-dfs.sh
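
For what it's worth, whether the daemons actually came up can be checked with jps (it ships with the JDK):

jps    # NameNode, DataNode and SecondaryNameNode should be listed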

Now I want to access HDFS from my local computer with a browser. Is that possible?

I tried http://192.168.11.7:9000 and http://192.168.11.7:50075, but to no avail; the browser just says "This site can't be reached".

Thank you very much

Edit:

This is the content of my hdfs-site.xml:

<configuration>
    <property>
        <name>dfs.replication</name>
        <value>1</value>
    </property>
    <property>
        <name>dfs.webhdfs.enabled</name>
        <value>true</value>
    </property>
    <property>
        <name>dfs.namenode.http-address</name>
        <value>0.0.0.0:50070</value>
    </property>
</configuration>

and my core-site.xml:

<configuration>
    <property>
        <name>fs.defaultFS</name>
        <value>hdfs://localhost:9000</value>
    </property>
    <property>
        <name>dfs.datanode.http.address</name>
        <value>0.0.0.0:50075</value>
    </property>
</configuration>

but when I access http://192.168.11.7:50075 from my local computer, it still doesn't work. Is there something wrong?
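
As a basic check (a sketch; nc is netcat), one can test from the local machine whether the ports are reachable at all, e.g. whether a firewall is blocking them:

nc -vz 192.168.11.7 50070    # namenode web UI port
nc -vz 192.168.11.7 50075    # datanode web UI port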

Thank you

Upvotes: 2

Views: 1343

Answers (3)

Nikhil Vandanapu

Reputation: 529

(Updated now that the question has been edited)

Try accessing port 50070, which is the default value of dfs.http.address; that is where the namenode's web interface is served. That should work. If it does, you can append /explorer.html#/ to the URL (i.e. http://192.168.11.7:50070/explorer.html#/) and browse the file system from there.

Refer to this SO answer to see the default port values for the various properties.

If http://192.168.11.7:50070/ doesn't work from your browser, wade through the output of hadoop org.apache.hadoop.conf.Configuration on your server to see whether the value of dfs.http.address has been changed.

So basically:

  • Check whether http://192.168.11.7:50070 works; if it does, go to http://192.168.11.7:50070/explorer.html# to access the file system.
  • Else, go through the output of hadoop org.apache.hadoop.conf.Configuration to see whether the value of dfs.http.address has been changed; it will probably be there (see the sketch after this list).
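
A minimal sketch of that second check, assuming the hadoop binary is on the server's PATH (the Configuration class dumps the effective configuration as XML; the file name effective-conf.xml is just an example):

hadoop org.apache.hadoop.conf.Configuration > effective-conf.xml
# the dump may come out as a single line, so extract values with grep -o;
# this matches dfs.http.address, dfs.namenode.http-address and similar keys
grep -o '<name>dfs[^<]*http[^<]*</name><value>[^<]*' effective-conf.xml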

[OLD ANSWER]

What you are looking for is an SSH connection to your remote server. I believe this nice tutorial will help you achieve that.

It's unlikely that you are using a Windows server, but if you are, I believe this'll help you out.

When you do this you get terminal access to your remote server.

If you are looking for browser access, you may try something similar to what is listed here and here.
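
For example, one common approach (a sketch, assuming you have SSH access to the server; the user name hduser is hypothetical) is to forward the namenode's web port over an SSH tunnel and browse it locally:

ssh -L 50070:localhost:50070 hduser@192.168.11.7
# while the tunnel is open, http://localhost:50070 in your local browser
# reaches the remote namenode's web UI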

Upvotes: 1

ozw1z5rd

Reputation: 3208

Please note:

<property>
        <name>fs.defaultFS</name>
        <value>hdfs://localhost:9000</value>
</property>

is not for users and their browsers. This value is read by Java programs when they need to access HDFS. What you are looking for is this key:

<property>
        <name>dfs.datanode.http.address</name>
        <value>0.0.0.0:50075</value>
</property>

This is where the datanode exposes its status. To browse HDFS with a web browser, you need to enable WebHDFS by putting

<property>
  <name>dfs.webhdfs.enabled</name>
  <value>true</value>
</property>
<property>
  <name>dfs.namenode.http-address</name>
  <value>0.0.0.0:50070</value>
</property>

into hdfs-site.xml.

Then go to http://hostname:50070 to access the WebHDFS UI; from there you can check everything.
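
Once WebHDFS is enabled, you can also verify it from the remote machine without a browser (a sketch; the path / is just an example):

curl "http://192.168.11.7:50070/webhdfs/v1/?op=LISTSTATUS"
# a JSON FileStatuses listing of the HDFS root means WebHDFS is reachable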

Upvotes: 3

genifer
genifer

Reputation: 31

There has to be some connection between the two machines. Configure either SSH or HttpFS.
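
If you go the HttpFS route, the gateway speaks the same REST API as WebHDFS, by default on port 14000 (a sketch; the user name hdfs is just an example):

curl "http://192.168.11.7:14000/webhdfs/v1/?op=LISTSTATUS&user.name=hdfs"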

Upvotes: 1
