rayman

Reputation: 21616

Memory Leaks using Executors.newFixedThreadPool()

I am using Spring 3.1 in a standalone environment.

(This problem is not necessarily related to Spring; it behaves the same in a plain standalone environment.)

I have implemented a listener that receives messages from a Topic. The message rate is very high (around 20-30 messages per second), and some messages take longer to process than others.

The listener works with a single instance, which means that if one message takes too long to process, it hurts our performance considerably.

We thought about building our own pool of listener objects instead of using a single instance, but then I discovered Executors (java.util.concurrent.Executors).

With this approach, each received message is handed off to a worker thread, so the listener instance stays free to process messages in parallel:

private ExecutorService threadPool = Executors.newFixedThreadPool(100);

@Override
public void onMessage(final Message msg)
{
    Runnable t = new Runnable()
    {
        public void run()
        {
            onSessionMessage(msg);
            log.trace("AbstractSessionBean, received message");
        }
    };
    threadPool.execute(t);
}

That seemed to solve our performance issue, but after monitoring the application with jconsole we are now facing what looks like a huge memory leak: heap memory usage increases significantly over time.

I tried to "play" a bit with the fixed thread pool size, but memory usage is still huge:


Any idea how I can solve this? Any other ideas for solving my underlying problem?

jconsole after performing GC

jconsole overall view

After running heap dump I got two problem suspects:

Heap Dump

thanks, ray.

Upvotes: 7

Views: 21820

Answers (4)

Enda

Reputation: 39

I believe the issue you encountered is that the threadPool is not releasing its resources. You need to call threadPool.shutdown() after you are finished submitting or executing. This waits until the submitted tasks have completed before terminating the threads, which can then be garbage collected.
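A minimal sketch of the shutdown-and-wait pattern this describes (the class name, pool size, and task are illustrative, not from the question):

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicInteger;

public class ShutdownDemo {
    public static void main(String[] args) throws InterruptedException {
        ExecutorService pool = Executors.newFixedThreadPool(4);
        AtomicInteger processed = new AtomicInteger();

        for (int i = 0; i < 10; i++) {
            // stands in for onSessionMessage(msg)
            pool.execute(processed::incrementAndGet);
        }

        // Stop accepting new tasks; already-queued tasks still run.
        pool.shutdown();
        // Block until everything has finished (or the timeout elapses),
        // so the worker threads can terminate and be garbage collected.
        if (!pool.awaitTermination(30, TimeUnit.SECONDS)) {
            pool.shutdownNow(); // give up and interrupt stragglers
        }
        System.out.println("processed=" + processed.get());
    }
}
```

Note that in the question's scenario the pool lives as long as the listener itself, so this pattern applies when the listener is stopped or destroyed.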

From official Java api website:

"An unused ExecutorService should be shut down to allow reclamation of its resources." https://docs.oracle.com/javase/7/docs/api/java/util/concurrent/ExecutorService.html#shutdown()

Alternatively you can use a newCachedThreadPool() which "Creates a thread pool that creates new threads as needed, but will reuse previously constructed threads when they are available" see https://docs.oracle.com/javase/7/docs/api/java/util/concurrent/Executors.html

When I encountered this problem, I went with the newFixedThreadPool() and shutdown option.

Upvotes: 3

Martin

Reputation: 11

Just a suggestion, but if 30 messages arrive per second and it takes the machine (even in parallel) longer than one second to process those 30 messages, then your queue of submitted tasks will grow without bound. You should make sure no tasks are submitted when the queue size exceeds a set limit, and wait a bit instead. Each Message object uses memory, and I think this could be your problem. A cached thread pool would not solve this.

You could test this quite simply by printing out your queue size. I don't know if this was solved already...
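One way to sketch this bounding idea, assuming the listener can tolerate being throttled: a ThreadPoolExecutor with a bounded queue and CallerRunsPolicy, so a full queue makes the submitting thread run the task itself instead of queueing without limit (the sizes here are illustrative, not tuned):

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;

public class BoundedPoolDemo {
    public static void main(String[] args) throws InterruptedException {
        // At most 100 workers and at most 1000 queued tasks. When the queue
        // is full, CallerRunsPolicy runs the task on the submitting thread,
        // which naturally throttles message intake.
        ThreadPoolExecutor pool = new ThreadPoolExecutor(
                100, 100,
                0L, TimeUnit.MILLISECONDS,
                new ArrayBlockingQueue<>(1000),
                new ThreadPoolExecutor.CallerRunsPolicy());

        for (int i = 0; i < 5000; i++) {
            pool.execute(() -> { /* onSessionMessage(msg) would go here */ });
        }
        // The queue can never hold more than its 1000-element capacity.
        System.out.println("queued now: " + pool.getQueue().size());

        pool.shutdown();
        pool.awaitTermination(30, TimeUnit.SECONDS);
    }
}
```

The back-pressure this creates is the point: the producer slows down instead of the heap filling up with pending Message objects.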

Upvotes: 1

Taky

Reputation: 5344

I don't think you have a memory leak; at least I cannot see one from your jconsole chart. There is no major GC collection, so it seems objects are simply being allocated into the tenured (old) generation over time. To confirm a leak, you should perform a GC and then compare the allocated memory before and after. If you do find a leak, you can take a heap dump with jmap or a visual tool (standard JDK tools), and the dump can then be analyzed with MAT. It is better to perform a GC before taking the heap dump, to reduce the dump file size.

Some notes:

  • Thread count shouldn't affect heap memory directly; it may be useful for you to review the Java memory structure. Threads require stack memory, not heap.
  • In general it is not a good idea to cache lightweight objects, because of how the GC algorithms work.
  • You should also consider a cached thread pool, or calibrate the thread pool size to the server hardware.
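As a rough illustration of calibrating pool size to the hardware (the blocking coefficient below is an assumed value for illustration, not something measured from the question):

```java
public class PoolSizing {
    public static void main(String[] args) {
        int cores = Runtime.getRuntime().availableProcessors();

        // Common rule of thumb: for CPU-bound work use roughly one thread
        // per core; for work that mostly waits (I/O, remote calls), scale
        // up by cores / (1 - blockingCoefficient).
        double blockingCoefficient = 0.9; // assumed: ~90% of task time is waiting
        int poolSize = (int) (cores / (1 - blockingCoefficient));

        System.out.println("cores=" + cores + ", suggested pool size=" + poolSize);
    }
}
```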

Upvotes: 2

Harish Raj

Reputation: 1575

After using an ExecutorService, you need to stop the thread pool: it is a resource, like a file or a database connection, that requires an explicit release. ExecutorService has shutdown() and shutdownNow() methods, which can be called in a finally block so that the threads can be garbage collected.

import java.util.concurrent.CountDownLatch;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

class SimpExec {
  public static void main(String args[]) {
    CountDownLatch countDownLatch = new CountDownLatch(5);
    CountDownLatch countDownLatch2 = new CountDownLatch(5);

    ExecutorService eService = Executors.newFixedThreadPool(2);

    eService.execute(new MyThread(countDownLatch, "A"));
    eService.execute(new MyThread(countDownLatch2, "B"));

    try {
      countDownLatch.await();
      countDownLatch2.await();
    } catch (InterruptedException exc) {
      System.out.println(exc);
    } finally {
      eService.shutdown(); // This is the call that avoids the memory leak.
      // eService.shutdownNow(); // -do-
    }
  }
}

class MyThread implements Runnable {
  String name;
  CountDownLatch latch;

  MyThread(CountDownLatch c, String n) {
    latch = c;
    name = n;
    // Note: do not start a new Thread here; the executor runs this Runnable.
  }

  public void run() {
    for (int i = 0; i < 5; i++) {
      latch.countDown();
    }
  }
}

If you forget to shut down, the memory leak happens. It is like streams that are not automatically closed by the JVM: the default Executors do not create daemon threads, so the worker threads keep the pool alive.

Upvotes: 0
