Reputation: 6038
Tomcat.exe is consuming 75% of the CPU. Does anyone have any idea why this happens and how it can be reduced?
I am using Tomcat 5.5 and J2SDK 1.4.2_12.
Upvotes: 15
Views: 121202
Reputation: 138
I recommend looking at the Tomcat log files, especially the one called catalina.out. In my case that file was growing rapidly with error messages saying "no permission to read folder /var/lib/mysql". My application includes a watch service that monitors the folder /var/lib/mysql. Once the application was allowed to read that folder, CPU usage dropped drastically. The high CPU usage appeared after a system update, which can change permissions on folders and files. So the cause of high CPU usage can be external to the web application, and even external to the Tomcat container.
Upvotes: 0
Reputation: 1547
In my case, I had just installed Tomcat 8 with default settings and had to set the JVM memory parameters -Xms and -Xmx. Once I increased the memory allocated to the JVM, CPU utilization came down drastically.
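These flags are normally passed to Tomcat via CATALINA_OPTS (for example in a setenv script). As a minimal sketch of how to check that the settings actually took effect, you can log the heap configuration from inside the JVM; the class name and the MB formatting below are just illustrative:
import java.lang.management.ManagementFactory;
import java.lang.management.MemoryUsage;

public class HeapCheck {
    public static void main(String[] args) {
        // Reports the heap limits the JVM actually started with,
        // which should reflect the -Xms/-Xmx values passed to it.
        MemoryUsage heap = ManagementFactory.getMemoryMXBean().getHeapMemoryUsage();
        System.out.println("initial heap: " + (heap.getInit() / (1024 * 1024)) + " MB");
        System.out.println("maximum heap: " + (heap.getMax() / (1024 * 1024)) + " MB");
        System.out.println("used heap:    " + (heap.getUsed() / (1024 * 1024)) + " MB");
    }
}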
Upvotes: 1
Reputation: 649
We just solved a problem with our Tomcat instance running with very high CPU usage, swinging up to 100% and beyond every few seconds and then briefly dropping again. This happened all day and all night, whether the server was doing any work or not. We are running Tomcat 8 with Java 8.
We did not find our solution in a web search, so I am posting it here in the hopes of helping someone else.
We had used per-application context files in the tomcat/conf/Catalina/localhost directory to point Tomcat at a directory other than its own webapps directory. The XML files in that directory looked like this:
<?xml version='1.0'?>
<Context docBase="/opt/dspace/amaddev/dspace-6.3/webapps/jspui"
         reloadable="true"
         cachingAllowed="false"/>
This worked: Tomcat ran with the code in those directories rather than its own webapps directory. However, it left us with the continually spiking CPU usage described above.
To test, we removed the XML files from the conf/Catalina/localhost directory and restarted Tomcat. Suddenly we had a normal, well-behaved Tomcat again. To point Tomcat at the other directory (where we compile our DSpace code), we instead used the Host entry in conf/server.xml and changed the appBase setting to our DSpace directory:
<Host name="localhost" appBase="/opt/dspace/amaddev/dspace-6.3/webapps"
unpackWARs="true" autoDeploy="true">
This now accomplishes what we wanted, with very low CPU usage (below 1%) when the server is idle.
Upvotes: 3
Reputation: 51
My log directory was full of Tomcat logs. I deleted all of them and CPU usage dropped dramatically.
Upvotes: 5
Reputation: 137
First of all (this applies to all Java applications), you must pin down which thread is using the CPU. This is possible since JDK 1.6, using java.lang.management.ManagementFactory.getThreadMXBean(). Here is an example of how to use it (JSP):
<%@ page import="java.lang.management.*, java.util.*" %>
<%!
    // Last observed CPU time (nanoseconds) and sample timestamp (milliseconds) per thread id.
    // These are JSP instance fields, so they survive between requests to this diagnostic page.
    Map<Long, Long> cpuTimes = new HashMap<Long, Long>();
    Map<Long, Long> cpuTimeFetch = new HashMap<Long, Long>();
%><%
    int cpus = Runtime.getRuntime().availableProcessors();
    ThreadMXBean threads = ManagementFactory.getThreadMXBean();
    long now = System.currentTimeMillis();
    out.println("<ul>");
    for (ThreadInfo info : threads.dumpAllThreads(false, false)) {
        long id = info.getThreadId();
        long current = threads.getThreadCpuTime(id); // nanoseconds; -1 if unavailable
        if (current < 0) {
            continue;
        }
        Long prev = cpuTimes.get(id);
        Long prevFetch = cpuTimeFetch.get(id);
        if (prev != null && prevFetch != null && now > prevFetch) {
            // CPU time consumed since the last sample, as a percentage of the
            // wall-clock time that elapsed across all available processors.
            double percent = (current - prev) / ((now - prevFetch) * cpus * 10000.0);
            if (percent > 0) {
                out.println("<li>" + info.getThreadName() + " (id " + id + ") " + percent + "% (" + prev + ", " + current + ")</li>");
            }
        }
        cpuTimes.put(id, current);
        cpuTimeFetch.put(id, now);
    }
    out.println("</ul>");
%>
Once you know which thread is hot, get a thread dump and look at what that thread is executing to find and fix the code responsible for the excessive CPU usage.
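For example, the same ThreadMXBean can show you what a suspect thread is executing without an external tool. This is only a sketch: the thread id below is a placeholder for whichever thread the page above reports as hot, and the frame limit of 50 is arbitrary:
import java.lang.management.ManagementFactory;
import java.lang.management.ThreadInfo;
import java.lang.management.ThreadMXBean;

public class HotThreadStack {
    public static void main(String[] args) {
        long hotThreadId = 42L; // placeholder: replace with the id of the busy thread
        ThreadMXBean threads = ManagementFactory.getThreadMXBean();
        // Fetch up to 50 stack frames for the suspect thread.
        ThreadInfo info = threads.getThreadInfo(hotThreadId, 50);
        if (info == null) {
            System.out.println("Thread no longer exists");
            return;
        }
        System.out.println(info.getThreadName() + " (" + info.getThreadState() + ")");
        for (StackTraceElement frame : info.getStackTrace()) {
            System.out.println("    at " + frame);
        }
    }
}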
Upvotes: 3
Reputation: 104168
Lambda Probe is a very handy tool for monitoring Tomcat.
Are you using a quad-CPU system? If so, Tomcat is probably running three of the CPUs at 100%. I would first check for an infinite loop or something similar in one of the applications.
Upvotes: 4
Reputation: 49629
The other answers cover how to make an exact diagnosis; I would just add that, in my experience, an infinite loop in one of your applications is the likely culprit.
As J-16 SDiZ said, your best bet is to run a profiler to narrow the problem down to one application.
Upvotes: 2
Reputation: 77121
If you're seeing 75% CPU and don't understand why, I suggest you issue a kill -3 to the Tomcat process (Ctrl-Break if you have a console) to get a thread dump, while the load is high. In my experience most threads should be either idle or in I/O wait. Look for any single branch of code that shows up repeatedly in the stack traces (in non-I/O waits!); that is your likely culprit. This is the "poor man's profiler", which is quite often the best and most efficient way to solve these problems.
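If you cannot send a signal to the process, a similar sampling can be done from inside the JVM with Thread.getAllStackTraces(). This is only a minimal sketch under that assumption: the sample count and interval are arbitrary, and in practice you would run it from a throwaway debug page or servlet rather than a main method:
import java.util.Map;

public class PoorMansProfiler {
    public static void main(String[] args) throws InterruptedException {
        // Take a few samples; code that keeps showing up near the top of
        // RUNNABLE threads is the likely hot spot.
        for (int sample = 0; sample < 5; sample++) {
            System.out.println("=== sample " + sample + " ===");
            for (Map.Entry<Thread, StackTraceElement[]> entry : Thread.getAllStackTraces().entrySet()) {
                Thread t = entry.getKey();
                if (t.getState() != Thread.State.RUNNABLE) {
                    continue; // skip threads that are waiting, parked or blocked
                }
                System.out.println(t.getName() + " (" + t.getState() + ")");
                for (StackTraceElement frame : entry.getValue()) {
                    System.out.println("    at " + frame);
                }
            }
            Thread.sleep(1000); // wait a second between samples
        }
    }
}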
Upvotes: 11
Reputation: 26910
To understand what's happening, you should try running it under a profiler. Try YourKit (http://www.yourkit.com/) or NetBeans (http://profiler.netbeans.org/docs/help/5.5/profile_j2ee_profileproject.html).
YourKit has better integration with Tomcat.
Upvotes: 4
Reputation: 34271
This is most likely caused by the application(s) you are running on top of Tomcat. Of course, very high traffic to your applications could also be the reason.
Upvotes: 0