James

Reputation: 3184

Why does RES memory keep slowly growing for Java processes, even for an out-of-the-box Spring Boot Admin?

I have 23 Java processes running on one machine with 32GB of RAM. No process specifies JVM memory parameters such as -Xmx. java -XX:+PrintFlagsFinal -version | grep MaxHeapSize reports that the default max heap size is 8GB, as expected.

Every process is a Spring Boot app (most at v2.3.4) running embedded Tomcat, except for one standalone Tomcat 9 instance running three WARs. These apps have low usage (usually one user and about 10 minutes of use a day) and are not memory or CPU intensive. One of them is Spring Boot Admin and another is Spring Cloud's Eureka service registry; for these two, I have only a main method that simply bootstraps the Spring Boot application.

Yet RES memory, as shown in top, keeps gradually increasing for every process. For example, the Spring Boot service registry has increased from 1.1GB to 1.5GB in the last 12 hours. All processes show a similar small increase in RES, but the total increase has reduced free memory by 2GB in that same 12-hour period. It was the same in the previous 12 hours (and so on), until free memory is now only 4.7GB.

My concern is that this trend continues (even without app usage). Memory is never freed by the apps, so total free memory keeps decreasing. Is this normal, given that each JVM perhaps sees that memory is still available in the OS and that 8GB of heap space is available to it? Will the JVMs stop taking memory at some point, say once an OS free-memory threshold is reached? Or will they continue until all free memory is used?

Update

The heap used for most apps is under 200MB, but the heap size is 1.5-2.8GB. Heap max is 8GB.
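(For reference, the question doesn't say how these numbers were measured; jcmd's GC.heap_info command, or jstat -gc, is one way to read used versus committed heap for a running JVM:)

jcmd <pid> GC.heap_info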

Upvotes: 5

Views: 2030

Answers (3)

jbindel

Reputation: 5635

The accepted answer here may help you.

Growing resident memory usage (RSS) of Java Process

Setting these environment variables for the Tomcat 9 JVM helped in my own situation, where a low-usage, low-memory application would nonetheless keep growing its RSS according to top on Linux. They disable some of the dynamic glibc heuristics that, in some Tomcat scenarios, keep freed memory from being returned to the OS.

# Cap the number of malloc arenas (the glibc default scales with CPU count)
export MALLOC_ARENA_MAX=4
# Pin the mmap and trim thresholds; setting them explicitly disables glibc's dynamic tuning
export MALLOC_MMAP_THRESHOLD_=131072
export MALLOC_TRIM_THRESHOLD_=131072
# Padding kept when the heap grows, and the cap on mmap-backed allocations
export MALLOC_TOP_PAD_=131072
export MALLOC_MMAP_MAX_=65536

A workaround in current JVMs (at least from Java 11 forward) is to use the System.trim_native_heap command in jcmd:

sudo jcmd <pid> System.trim_native_heap

There is some discussion of the problem of glibc not returning memory to the OS in the links below, and it looks like you may be running on Linux, so it is likely relevant:

https://bugs.openjdk.org/browse/JDK-8269345

https://bugs.openjdk.org/browse/JDK-8293114

https://marc.info/?l=openjdk-serviceability-dev&m=168879617126137

Upvotes: 0

Digao

Reputation: 560

I also faced this situation, and after a long time of research I found the solution here. Basically, in my case it was just a matter of setting the -Xms and -Xmx parameters on the jar invocation, which constrains the heap and forces the GC to run regularly.
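A minimal sketch of what that invocation looks like; the jar name and the 256m/512m sizes are placeholders, not values from the answer:

java -Xms256m -Xmx512m -jar my-app.jar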

Upvotes: 0

the8472

Reputation: 43125

Resident memory as reported by the OS doesn't tell you which component is consuming it. You'll have to gather additional data to figure out which part of the process is growing.

You'll have to track:

  • java heap and metaspace use - you can monitor this with JMC, GC logging and many other Java monitoring tools
  • JVM off-heap use - Native Memory Tracking (NMT; see the sketch after this list)
  • direct byte buffer use - MX beans, also available via JMC
  • use by memory-mapped files - pmap -x <pid>
  • use by native libraries, e.g. used via JNI - difficult to monitor
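Native Memory Tracking is built into the JVM; a minimal sketch of using it, assuming you can restart the process with the flag (app.jar and <pid> are placeholders):

# Start the JVM with NMT enabled (summary mode has low overhead)
java -XX:NativeMemoryTracking=summary -jar app.jar
# Later, get a per-category breakdown of the JVM's own native allocations
jcmd <pid> VM.native_memory summary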

Upvotes: 2
