amy

Reputation: 65

flume cannot connect to HDFS port 9099

I am trying to access log files on HDFS using Flume. I am connected to port 9099, but I don't know why Flume is trying to connect to 8020. I am getting the following error:

java.net.ConnectException: Call From localhost.localdomain/127.0.0.1 to localhost:8020 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused

The NameNode is listening on port 9099, as confirmed with netstat -tlpn | grep :9099.

I think one way to fix this is to format the NameNode and set the port to 8020, but I don't want to do that because it would wipe everything. Please help.

Upvotes: 1

Views: 835

Answers (2)

user3504158

Reputation: 89

8020 is the default port; to override it, you can use flume-conf.properties. Update your config with

kafkaTier1.sinks.KafkaHadoopSink.hdfs.path = hdfs://NAME_NODE_HOST:PORT/flume/kafkaEvents/%y-%m-%d/%H%M/%S
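For example, a fuller sketch of the HDFS sink section, reusing the agent and sink names from the line above and assuming the NameNode from the question is at localhost:9099 (the host, port, and fileType setting are assumptions, not confirmed by the answer):

# Hypothetical sketch: spell out the NameNode host and port in hdfs.path so Flume does not fall back to the default 8020
kafkaTier1.sinks.KafkaHadoopSink.type = hdfs
kafkaTier1.sinks.KafkaHadoopSink.hdfs.path = hdfs://localhost:9099/flume/kafkaEvents/%y-%m-%d/%H%M/%S
kafkaTier1.sinks.KafkaHadoopSink.hdfs.fileType = DataStream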

Upvotes: 0

Mr.Chowdary

Reputation: 3407

8020 is the default port the NameNode runs on.

You can change this in core-site.xml via the property fs.default.name. As you mentioned, it is running on port 9099, so check whether that port is specified there (see the sketch below).
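A minimal core-site.xml sketch, assuming the NameNode should keep listening on 9099 (the host localhost is an assumption; fs.default.name is the older name of this property, called fs.defaultFS in newer Hadoop versions):

<!-- Hypothetical sketch: the URI here is what clients such as Flume use to reach the NameNode -->
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9099</value>
  </property>
</configuration>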
Also check the Flume configuration file that specifies the NameNode details. Alternatively, you can stop the cluster, change the port number back to the default, and restart it. There is no need to format the NameNode for this; I tested the same before answering your question.
Hope it helps!

Upvotes: 0
