Reputation: 25996
Is there a tool that is able to collect and count (Java) stacktraces in a large logfile, such that you get an overview which errors occur most often?
Upvotes: 0
Views: 302
Reputation: 25996
Since I haven't found anything usable and I've run into this problem again and again, I wrote a StacktraceCounter that optionally filters out stacktrace lines from the container and collects the thus normalized stacktraces together with the log lines that trigger them. That goes beyond what Dynatrace and Splunk can easily do.
The following Scala program StacktraceCounter.scala finds the log messages with (Java) stacktraces in the logfiles given as arguments, groups them after removing irrelevant lines, and counts them. Finally, they are printed in descending order of frequency. You might have to customize the regular expressions for your log format, though.
#!/bin/sh
exec scala "$0" "$@"
!#

import java.io.File
import scala.collection.mutable.ArrayBuffer
import scala.io.Source

/** Counts the stacktraces in a logfile. The stacktraces are normalized, counted, sorted by descending frequency. */
object StacktraceCounter {

  def main(args: Array[String]) = {
    def isStackLine(line: String) = line.matches("""^\s+at .*|^Caused by: .*|^\S+... [0-9]+ more.*""")
    def isLogEntryStart(line: String) = //line.matches("""^(202[0-9]-[01]\d-[0-3]\d|[0-3]\d\.[01]\d\.202[0-9] ).*""")
      line.matches("""^\s*([0-9]{1,4}[-:., ]+){6,7}.*""") // rough but probably matches many formats

    // possibly also com.adobe|com.day.cq
    val uninterestingStackLines = ("""at java.lang.reflect|at org.apache.sling|at org.eclipse.jetty|at org.apache.jackrabbit|at org.apache.felix|at java.base/|org.quartz|org.apache.hc|at org.springframework|at \S+\$|at jdk.internal""").r

    // Groups the raw lines into log messages: each message starts at a line matching isLogEntryStart
    // and extends up to (but not including) the next such line.
    def groupToLogMessages(lines: Iterator[String]) =
      new Iterator[List[String]] {
        private val it = lines.dropWhile(!isLogEntryStart(_)).buffered
        override def hasNext: Boolean = it.hasNext
        override def next(): List[String] = {
          val res = new ArrayBuffer[String]
          if (it.hasNext) {
            res += it.next()
            while (it.hasNext && !isLogEntryStart(it.head)) res += it.next()
          }
          res.toList
        }
      }

    val logmessages: Iterator[List[String]] = args.iterator.flatMap { arg =>
      val source = if ("-" == arg) Source.stdin else Source.fromFile(new File(arg.trim), "UTF-8")
      groupToLogMessages(source.getLines())
    }

    case class StacktraceInfo(heading: String, normalizedStacktrace: String)

    println("Filtering " + uninterestingStackLines)
    println()

    // Splits a log message into its heading (the non-stacktrace lines at the top)
    // and the stacktrace with the uninteresting frames removed.
    def normalize(trace: List[String]): StacktraceInfo = StacktraceInfo(
      trace.takeWhile(!isStackLine(_)).mkString("\n"),
      trace.filter(isStackLine).filterNot(uninterestingStackLines.findFirstIn(_).isDefined).mkString("\n").intern()
    )

    val groupedLogMessages = {
      val buf = logmessages.map(normalize).filterNot(_.normalizedStacktrace.isEmpty).toBuffer
      buf.groupBy(_.normalizedStacktrace)
    }

    groupedLogMessages.toList.sortBy(-_._2.size) foreach { case (normalizedTrace, threadGroup) =>
      println(threadGroup.size.formatted("==== %6d") + " ===============================================================")
      println(normalizedTrace)
      println("------------------")
      threadGroup foreach { thread =>
        println(thread.heading)
        println()
      }
    }
  }
}
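If the full script is more than you need, a rough overview of the most frequent errors can also be had with standard Unix tools by counting the top-level exception and "Caused by:" lines. This is only a sketch; the filename error.log and the pattern are assumptions you will need to adapt to your log format.

```shell
# Rough overview: count the most frequent exception / "Caused by:" lines.
# error.log and the pattern are assumptions - adapt them to your log format.
grep -E '^(Caused by: )?[A-Za-z0-9_.$]+(Exception|Error)(:|$)' error.log \
  | sort | uniq -c | sort -rn | head -20
```

Unlike the script above, this ignores the stack frames entirely, so different traces that happen to start with the same exception line are merged.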
Upvotes: -1
Reputation: 496
I'm not sure whether there is a tool available to evaluate log files, but you may have more success with a tool like AppDynamics. This is a monitoring tool that can be used to evaluate live application performance and can be configured to monitor exception frequency.
Good luck.
Mark.
Upvotes: 1
Reputation: 2745
I am not aware of any automatic tool, but LogMX will give you a nice, clean overview of your log file with search options.
Upvotes: 3
Reputation: 53516
This probably isn't the best answer, but I am going to try to answer the spirit of your question: you should try Dynatrace. It's not free and it doesn't work with log files per se, but it can give you very detailed reports of what types of exceptions are thrown, from where, and when, on top of a lot of other info.
Upvotes: 1