oula alshiekh

Reputation: 893

webhdfs create file exception

I am using Hadoop 2.7.1 on CentOS 7.

If I want to use WebHDFS with a Hadoop cluster, I should configure this property:

<property>
      <name>dfs.webhdfs.enabled</name>
      <value>true</value>
</property>

But what else do I need?

My NameNode IP is 192.168.4.128.

When I use Firefox on Windows 8 and send this GET request:

http://192.168.4.128:50070/webhdfs/v1/hadoopDir/A.txt/?user.name=SYSTEM&op=OPEN

I can open the file and see its content.

But when I send this PUT request:

http://192.168.4.128:50070/webhdfs/v1/HadoopDir/B.txt/?user.name=SYSTEM&op=CREATE&data=true

I get the following response:

{"RemoteException":{"exception":"IllegalArgumentException","javaClassName":"java.lang.IllegalArgumentException","message":"Invalid value for webhdfs parameter \"op\": No enum constant org.apache.hadoop.hdfs.web.resources.GetOpParam.Op.CREATE"}}

Why does the GET operation (opening the file) work fine, but not the PUT?

Are there any other configurations needed to process this request?

Upvotes: 1

Views: 1370

Answers (1)

franklinsijo

Reputation: 18270

WebHDFS uses four kinds of HTTP operations (GET, POST, PUT, DELETE), whereas web browsers normally support only GET and POST.

Install a plugin such as RESTClient (for Firefox) or Advanced REST Client (for Chrome) to send HTTP PUT and HTTP DELETE requests.

Alternatively, you can use curl from the command line.
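For example, here is a minimal sketch of the two-step CREATE with curl, using the asker's NameNode address and path (the exact step-2 URL comes back in the Location header of step 1, and the local file name B.txt is only an assumption):

    # Step 1: ask the NameNode to create the file; it replies with a
    # 307 TEMPORARY_REDIRECT whose Location header points at a DataNode.
    curl -i -X PUT "http://192.168.4.128:50070/webhdfs/v1/HadoopDir/B.txt?op=CREATE&user.name=SYSTEM"

    # Step 2: send the file contents with a second PUT to the URL
    # returned in the Location header (placeholder shown here).
    curl -i -X PUT -T B.txt "<URL-from-Location-header>"

Because curl issues a genuine HTTP PUT, op=CREATE is parsed against the PUT operations rather than GetOpParam, and the IllegalArgumentException from the question goes away.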

Note: WebHDFS is enabled by default, so the property added in hdfs-site.xml is redundant.

Upvotes: 3
