Reputation: 33495
I am trying to get events from Log4j 1.x into HDFS through Flume using the Log4j Flume appender. I created two appenders, FILE and flume. It works for the FILE appender, but with the flume appender the program just hangs in Eclipse. Flume itself works properly: I am able to send messages to the Avro source using the avro-client and see the messages in HDFS. But it's not getting integrated with Log4j 1.x.
I don't see any exception, except for the following in log.out:
Batch size string = null
Using Netty bootstrap options: {tcpNoDelay=true, connectTimeoutMillis=20000}
Connecting to localhost/127.0.0.1:41414
[id: 0x52a00770] OPEN
and from the Flume console
2013-10-23 14:32:32,145 (pool-5-thread-1) [INFO - org.apache.avro.ipc.NettyServer$NettyServerAvroHandler.handleUpstream(NettyServer.java:171)] [id: 0x577cf6e4, /127.0.0.1:46037 => /127.0.0.1:41414] OPEN
2013-10-23 14:32:32,148 (pool-6-thread-1) [INFO - org.apache.avro.ipc.NettyServer$NettyServerAvroHandler.handleUpstream(NettyServer.java:171)] [id: 0x577cf6e4, /127.0.0.1:46037 => /127.0.0.1:41414] BOUND: /127.0.0.1:41414
2013-10-23 14:32:32,148 (pool-6-thread-1) [INFO - org.apache.avro.ipc.NettyServer$NettyServerAvroHandler.handleUpstream(NettyServer.java:171)] [id: 0x577cf6e4, /127.0.0.1:46037 => /127.0.0.1:41414] CONNECTED: /127.0.0.1:46037
2013-10-23 14:32:43,086 (pool-6-thread-1) [INFO - org.apache.avro.ipc.NettyServer$NettyServerAvroHandler.handleUpstream(NettyServer.java:171)] [id: 0x577cf6e4, /127.0.0.1:46037 :> /127.0.0.1:41414] DISCONNECTED
2013-10-23 14:32:43,096 (pool-6-thread-1) [INFO - org.apache.avro.ipc.NettyServer$NettyServerAvroHandler.handleUpstream(NettyServer.java:171)] [id: 0x577cf6e4, /127.0.0.1:46037 :> /127.0.0.1:41414] UNBOUND
2013-10-23 14:32:43,096 (pool-6-thread-1) [INFO - org.apache.avro.ipc.NettyServer$NettyServerAvroHandler.handleUpstream(NettyServer.java:171)] [id: 0x577cf6e4, /127.0.0.1:46037 :> /127.0.0.1:41414] CLOSED
2013-10-23 14:32:43,097 (pool-6-thread-1) [INFO - org.apache.avro.ipc.NettyServer$NettyServerAvroHandler.channelClosed(NettyServer.java:209)] Connection to /127.0.0.1:46037 disconnected.
In case it helps, I ran the program in debug mode, and when it hung I suspended the thread and took the stack trace below. I tried to look into the code, but I'm not sure why the program hangs with the flume appender.
Daemon Thread [Avro NettyTransceiver I/O Worker 1] (Suspended)
Logger(Category).callAppenders(LoggingEvent) line: 205
Logger(Category).forcedLog(String, Priority, Object, Throwable) line: 391
Logger(Category).log(String, Priority, Object, Throwable) line: 856
Log4jLoggerAdapter.debug(String) line: 209
NettyTransceiver$NettyClientAvroHandler.handleUpstream(ChannelHandlerContext, ChannelEvent) line: 491
DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline$DefaultChannelHandlerContext, ChannelEvent) line: 564
DefaultChannelPipeline$DefaultChannelHandlerContext.sendUpstream(ChannelEvent) line: 792
NettyTransportCodec$NettyFrameDecoder(SimpleChannelUpstreamHandler).channelBound(ChannelHandlerContext, ChannelStateEvent) line: 166
NettyTransportCodec$NettyFrameDecoder(SimpleChannelUpstreamHandler).handleUpstream(ChannelHandlerContext, ChannelEvent) line: 98
DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline$DefaultChannelHandlerContext, ChannelEvent) line: 564
DefaultChannelPipeline.sendUpstream(ChannelEvent) line: 559
Channels.fireChannelBound(Channel, SocketAddress) line: 199
NioWorker$RegisterTask.run() line: 191
NioWorker(AbstractNioWorker).processRegisterTaskQueue() line: 329
NioWorker(AbstractNioWorker).run() line: 235
NioWorker.run() line: 38
DeadLockProofWorker$1.run() line: 42
ThreadPoolExecutor.runWorker(ThreadPoolExecutor$Worker) line: 1145
ThreadPoolExecutor$Worker.run() line: 615
Thread.run() line: 744
Here is the Java program
import java.io.IOException;
import java.sql.SQLException;

import org.apache.log4j.Logger;

public class log4jExample {
    static Logger log = Logger.getRootLogger();

    public static void main(String[] args) throws IOException, SQLException {
        log.debug("Hello, this is a debug message");
    }
}
Here is the log4j.properties
# Define the root logger with appender file
log = /home/vm4learning/WorkSpace/BigData/Log4J-Example/log
log4j.rootLogger = DEBUG, FILE, flume
# Define the file appender
log4j.appender.FILE=org.apache.log4j.FileAppender
log4j.appender.FILE.File=${log}/log.out
log4j.appender.FILE.layout=org.apache.log4j.PatternLayout
log4j.appender.FILE.layout.conversionPattern=%m%n
# Define the flume appender
log4j.appender.flume = org.apache.flume.clients.log4jappender.Log4jAppender
log4j.appender.flume.Hostname = localhost
log4j.appender.flume.Port = 41414
log4j.appender.flume.UnsafeMode = false
log4j.appender.flume.layout=org.apache.log4j.PatternLayout
log4j.appender.flume.layout.ConversionPattern=%m%n
Here are the dependencies in Eclipse
flume-ng-log4jappender-1.4.0.jar
log4j-1.2.17.jar
flume-ng-sdk-1.4.0.jar
avro-1.7.3.jar
netty-3.4.0.Final.jar
avro-ipc-1.7.3.jar
slf4j-api-1.6.1.jar
slf4j-log4j12-1.6.1.jar
Here is the flume.conf content
# Tell agent1 which ones we want to activate.
agent1.channels = ch1
agent1.sources = avro-source1
agent1.sinks = hdfs-sink1
# Define a memory channel called ch1 on agent1
agent1.channels.ch1.type = memory
# Define an Avro source called avro-source1 on agent1 and tell it
# to bind to 0.0.0.0:41414. Connect it to channel ch1.
agent1.sources.avro-source1.type = avro
agent1.sources.avro-source1.bind = 0.0.0.0
agent1.sources.avro-source1.port = 41414
# Define a logger sink that simply logs all events it receives
# and connect it to the other end of the same channel.
agent1.sinks.hdfs-sink1.type = hdfs
agent1.sinks.hdfs-sink1.hdfs.path = hdfs://localhost:9000/flume/events/
agent1.sinks.hdfs-sink1.channel = ch1
agent1.sources.avro-source1.channels = ch1
How to get around this problem?
Upvotes: 2
Views: 4127
Reputation: 3769
I had a similar problem; the solution is:

but I didn't know what happened inside flume-ng. I'm trying to debug it. If anyone knows, please tell me, thanks.
Upvotes: 1
Reputation: 11
I had a similar problem using the Flume appender in Log4j once. The program would hang whenever I tried to instantiate the Logger object. I recall that the problem was that I didn't have all the required libraries on the classpath, and once I added them, it worked fine.
I'd recommend starting by getting Mike Percy's simple example to work. Although the pom.xml there builds a single JAR containing all the dependencies, editing it to copy the dependent JAR files to a separate directory instead gives me this list:
avro-1.7.4.jar
avro-ipc-1.7.4.jar
commons-codec-1.3.jar
commons-collections-3.2.1.jar
commons-compress-1.4.1.jar
commons-lang-2.5.jar
commons-logging-1.1.1.jar
flume-ng-log4jappender-1.4.0-cdh4.5.0.jar
flume-ng-sdk-1.4.0-cdh4.5.0.jar
hamcrest-core-1.1.jar
httpclient-4.0.1.jar
httpcore-4.0.1.jar
jackson-core-asl-1.8.8.jar
jackson-mapper-asl-1.8.8.jar
jetty-6.1.26.jar
jetty-util-6.1.26.jar
junit-4.10.jar
libthrift-0.7.0.jar
log4j-1.2.16.jar
netty-3.5.0.Final.jar
paranamer-2.3.jar
slf4j-api-1.7.2.jar
slf4j-jdk14-1.7.2.jar
snappy-java-1.0.4.1.jar
velocity-1.7.jar
xz-1.0.jar
A few of those libraries (like junit) might not really be necessary, but I would suggest using all of them at first to see if you can get your example working, then experiment with determining the minimal required set afterwards.
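For reference, one way to produce such a directory of JARs from the example's pom.xml (an untested sketch; the `lib` output directory is my own choice) is the `copy-dependencies` goal of the standard maven-dependency-plugin, added under `<build><plugins>`:

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-dependency-plugin</artifactId>
  <executions>
    <execution>
      <id>copy-dependencies</id>
      <phase>package</phase>
      <goals>
        <goal>copy-dependencies</goal>
      </goals>
      <configuration>
        <!-- After `mvn package`, all dependency JARs land here -->
        <outputDirectory>${project.build.directory}/lib</outputDirectory>
      </configuration>
    </execution>
  </executions>
</plugin>
```

You can then add everything in `target/lib` to the Eclipse build path and trim from there.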
Upvotes: 1
Reputation: 2366
My guess is that you're trying to log Flume's own events through Flume: your stack trace shows Netty's I/O worker thread calling Log4jLoggerAdapter.debug and re-entering Logger.callAppenders, i.e. the transport the flume appender uses is itself logging at DEBUG through the same appender. I've seen this problem with other appenders, but not with the Log4j 1.x one.
I would modify the log4j.properties to exclude Flume, Netty, and Avro events and see if that fixes it.
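As a sketch of that exclusion (untested; the logger names are the usual package prefixes for Flume, Avro, and Netty 3), you could raise those packages above DEBUG and route them to the FILE appender only, so the flume appender never tries to ship its own transport's events:

```
# Keep Flume/Avro/Netty internals away from the flume appender
log4j.logger.org.apache.flume = INFO, FILE
log4j.additivity.org.apache.flume = false
log4j.logger.org.apache.avro = INFO, FILE
log4j.additivity.org.apache.avro = false
log4j.logger.org.jboss.netty = INFO, FILE
log4j.additivity.org.jboss.netty = false
```

The `additivity = false` lines stop those events from also propagating to the root logger, which is where the flume appender is attached.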
Upvotes: 1