Reputation: 31
I see the following parameter in one of our Java application startup command lines (IBM JVM): -Xcodecache.
Although intuitively I understand that this parameter controls the size of the cache for code modules, I am unable to find a description of this parameter in any documentation.
In addition, how does this parameter relate to servlet caching, which is configured with a different setting? Thanks for your help.
Upvotes: 3
Views: 559
Reputation: 27242
Reading the IBM docs, it looks like this controls how much JIT-compiled code you can have:
The compiled code is placed into a part of the JVM process space called the code cache; the location of the method within the code cache is recorded, so that future calls to it will invoke the compiled code.
Upvotes: 0
Reputation: 1963
Take a look at this documentation.
It looks like the IBM code cache is where the JVM stores the native JIT-compiled code generated from your classes. The JVM keeps this in a chunk of memory separate from its other resources.
For many applications this code cache memory will be fairly stable once most methods have been JIT compiled. The IBM JVM is smart about how it does this and tries to use only a little more memory than it needs. However, early in the JVM life cycle, while methods are still being JIT compiled, or when code generates new classes or uses reflection, this memory will continue to be consumed. When the JVM determines that it doesn't have enough space in the currently allocated code cache, it allocates a new block to increase the size. The -Xcodecache parameter sets the size of these newly allocated chunks.
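On the startup command line that looks something like the following sketch; the 8m block size and the jar name are purely illustrative placeholders, not recommendations:

    java -Xcodecache8m -jar yourApp.jar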
This is a relevant passage from the linked documentation:
The JIT compiler uses memory intelligently. When the code cache is initialized, it consumes relatively little memory. As more methods are compiled into native code, the code cache is grown dynamically to accommodate the needs of the program. Space previously occupied by discarded or recompiled methods is reclaimed and reused. When the size of the code cache reaches a predefined upper limit, it stops growing. The JIT compiler will stop all future attempts to compile methods, to avoid exhausting the system memory and affecting the stability of the application or the operating system.
The documentation says that the default value is architecture-specific, which makes sense since you will probably want different-sized chunks allocated on a 64-bit versus a 32-bit system. You would want to adjust this value when you are running an application that regularly loads new classes into the JVM; in that case you may want to increase it so that new allocations happen less frequently. You shouldn't set it too high, though, since you don't want to allocate much more memory than you will actually use. The documentation suggests that
A reasonable starting point to tune for the optimal size is (totalNumberByteOfCompiledMethods * 1.1).
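As a purely illustrative calculation with made-up numbers: if your compiled methods added up to roughly 8 MB of native code, the formula gives 8 MB * 1.1 ≈ 8.8 MB, so something in the region of -Xcodecache9m would be a first guess; the real figure has to come from measuring your own application.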
Upvotes: 3