Reputation: 373
I have a server, written in Java, that works fine on Windows 7. But when I installed the same server on Ubuntu, it throws an out-of-memory error, even though the machine has 8 GB of RAM and the server is using only about 70% of it. CPU usage also peaks at the moment the out-of-memory error occurs.
The out-of-memory error happens only on Ubuntu, never on Windows. On Windows, Task Manager shows 100% CPU usage and full RAM usage. I would like to achieve the same behavior on Ubuntu.
Upvotes: 0
Views: 672
Reputation: 128899
If you mean that Java throws an OutOfMemoryError, you first need to understand why. You probably want to pick up a profiler like VisualVM (free) or YourKit (not free) to see what's going on in memory. After that, you can decide on the appropriate action(s), like making your app more memory-efficient or changing the JVM settings to increase the appropriate memory allocation.
To get you started: if you're running out of heap space (the old generation), you can raise the maximum heap size with the JVM flag -Xmx. For example, -Xmx1024m or -Xmx1g will set the maximum to 1 GB. If it's the permanent generation (PermGen) that is exhausted, you can allocate a larger chunk of overall memory to it with the -XX:MaxPermSize JVM flag, such as -XX:MaxPermSize=256m. Note that increasing the size of the permanent generation will correspondingly decrease the amount of memory available to the other heap segments.
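To verify that the flags actually took effect, a quick sanity check is to print what the running JVM reports via java.lang.Runtime. This is a minimal sketch (the class name MemoryInfo is my own, not from the question); run it with, e.g., java -Xmx1g MemoryInfo and the reported max heap should reflect the -Xmx value:

```java
// Minimal sketch: print the JVM's view of its heap limits.
// Launch with e.g. `java -Xmx1g MemoryInfo` and compare the output
// to the -Xmx value you passed.
public class MemoryInfo {
    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();
        long maxMb = rt.maxMemory() / (1024 * 1024);       // ceiling set by -Xmx
        long committedMb = rt.totalMemory() / (1024 * 1024); // heap currently committed
        long freeMb = rt.freeMemory() / (1024 * 1024);     // free space within committed heap
        System.out.println("max heap:  " + maxMb + " MB");
        System.out.println("committed: " + committedMb + " MB");
        System.out.println("free:      " + freeMb + " MB");
    }
}
```

Comparing "max heap" here against your -Xmx setting (the JVM may report slightly less than the flag's nominal value) quickly tells you whether the flag is being picked up by the startup script on Ubuntu the same way it is on Windows.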
Full details on these and other JVM memory settings can be found on the Java application launcher page and the HotSpot VM Options page (assuming you're using the HotSpot VM).
Upvotes: 2