godspeed

Reputation: 1

Hadoop cannot access S3 on AWS

I have a question about Hadoop accessing S3 on AWS. Here is my core-site.xml:

<configuration>
  <property>
    <name>fs.default.name</name>
    <value>s3n://testhadoophiveserver</value>
  </property>
  <property>
    <name>fs.s3n.awsAccessKeyId</name>
    <value>(filled in)</value>
  </property>
  <property>
    <name>fs.s3n.awsSecretAccessKey</name>
    <value>(filled in)</value>
  </property>
</configuration>
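
As a sanity check on the credentials themselves, independent of any daemons, the bucket can be listed directly with the FileSystem shell (a minimal sketch, assuming the Hadoop 1.x CLI and the bucket name above):

bin/hadoop fs -ls s3n://testhadoophiveserver/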

So I got an error when I ran start-all.sh, like this:

hadoopmaster: Exception in thread "main" java.net.UnknownHostException: unknown host: testhadoophiveserver
hadoopmaster:   at org.apache.hadoop.ipc.Client$Connection.<init>(Client.java:195)
hadoopmaster:   at org.apache.hadoop.ipc.Client.getConnection(Client.java:850)
hadoopmaster:   at org.apache.hadoop.ipc.Client.call(Client.java:720)
hadoopmaster:   at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:220)
hadoopmaster:   at $Proxy4.getProtocolVersion(Unknown Source)
hadoopmaster:   at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:359)
hadoopmaster:   at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:346)
hadoopmaster:   at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:383)
hadoopmaster:   at org.apache.hadoop.ipc.RPC.waitForProxy(RPC.java:314)

But if I use HDFS, everything works fine. Right now I cannot use the S3 filesystem. Can anyone help?

Upvotes: 0

Views: 606

Answers (1)

yanbohappy

Reputation: 94

I think you should not run start-all.sh. The start-all.sh script starts both HDFS and MapReduce, and there is no need to start HDFS if you have configured S3 as the underlying storage layer. start-all.sh calls start-dfs.sh, which executes the code to start HDFS, the component you did not configure; that is why startup fails while trying to resolve testhadoophiveserver (the bucket name in your fs.default.name) as a NameNode host.
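A minimal sketch of the alternative, assuming a Hadoop 1.x installation where start-mapred.sh starts only the JobTracker and TaskTrackers:

# Skip start-all.sh; it calls start-dfs.sh, which tries to reach a
# NameNode at fs.default.name (here an S3 bucket, hence the
# UnknownHostException).

# Start only the MapReduce daemons:
bin/start-mapred.sh

S3 itself needs no local daemons, so once the MapReduce daemons are up, jobs can read and write s3n:// paths directly.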

Upvotes: 1
