Saurabh Gokhale

Reputation: 46415

Hadoop Directory with Spaces

I'm running into a problem when giving Hadoop a directory path that contains spaces.

e.g.

inputDir = /abc/xyz/folder name/abc.txt

Hadoop doesn't recognize "folder name" as a single folder name containing a space.

I get the following error:

java.io.FileNotFoundException: File does not exist: /abc/xyz/folder

I also tried URL-encoding the path:

java.io.FileNotFoundException: File does not exist: /abc/xyz/folder%20name/abc.txt

But it still throws the same error.

Does anybody know a workaround for this?

Any help is appreciated.
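A note on the likely cause (an assumption, since the submitting code isn't shown in the question): the error path is truncated exactly at the space, which is what you'd see if the path reached Hadoop as an unquoted argument that got tokenized on whitespace. A minimal JDK-only sketch of that tokenization:

```java
public class SpacePathSplit {
    public static void main(String[] args) {
        // Hypothetical illustration: an unquoted path argument gets split
        // on whitespace, so only the first token reaches Hadoop -- which
        // matches the truncated path in the FileNotFoundException.
        String inputDir = "/abc/xyz/folder name/abc.txt";
        String[] tokens = inputDir.split(" ");
        System.out.println(tokens[0]); // prints "/abc/xyz/folder" -- the path in the exception
        System.out.println(tokens[1]); // prints "name/abc.txt"
    }
}
```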

Upvotes: 2

Views: 6325

Answers (4)

cherah30

Reputation: 476

inputDir = "/abc/xyz/folder name/"

should work.

hadoop fs -ls "/abc/xyz/folder name/"

works fine.

Upvotes: 0

Ravindra babu

Reputation: 38950

Hadoop does not support spaces in input directory paths.

Replace space with _ or your preferred separator character in your directory paths.

Upvotes: 2

hrobertv

Reputation: 158

Replacing the space with %20 works for Hadoop shell. As in

sed 's/ /\%20/g'

And in the actual put command

hadoop fs -put "$inputDir" $putDest

Without the %20 you get a URI exception. (That's what tipped me off to use %20 instead of an escape character \ .)

I realize you're doing this via Java. The fact that you're getting a java.io.FileNotFoundException makes me wonder whether the code does something else with inputDir besides passing it as the argument to hadoop put (or an equivalent command). If it checks inputDir in any way outside of Hadoop commands, it will fail: Java sees it as a path, while Hadoop sees it as a URI.
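The path-versus-URI distinction above can be shown with plain JDK classes (a sketch with a made-up example path, not the asker's actual code): java.net.URI rejects a literal space, but its multi-argument constructor percent-encodes the space for you.

```java
import java.net.URI;
import java.net.URISyntaxException;

public class PathVsUriDemo {
    public static void main(String[] args) throws URISyntaxException {
        String raw = "/abc/xyz/folder name/abc.txt";

        // The single-argument constructor rejects a literal space outright.
        try {
            new URI(raw);
        } catch (URISyntaxException e) {
            System.out.println("a literal space is illegal in a URI");
        }

        // The multi-argument constructor percent-encodes illegal characters,
        // producing the %20 form that the Hadoop shell accepts.
        URI uri = new URI(null, null, raw, null);
        System.out.println(uri.toString()); // /abc/xyz/folder%20name/abc.txt
        System.out.println(uri.getPath());  // decoded back: /abc/xyz/folder name/abc.txt
    }
}
```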

Upvotes: 2

SSaikia_JtheRocker

Reputation: 5063

Try setting the path using set("path", "/abc/xyz/folder\\ name/abc.txt"); note the double backslash.

Upvotes: 0
