Prince Gupta

Reputation: 41

How can I analyse a large heap dump of around 35-40 GB?

I have to analyse a Java heap dump of 35-40 GB, which can't be loaded on a local machine; it only fits on a remote server with a large amount of memory.

The best resource I have found so far is Tool for analyzing large Java heap dumps. But after configuring everything and executing all the command lines properly, I was not able to get any report file.

My ParseHeapDump.sh file looks as

#!/bin/sh
#
# This script parses a heap dump.
#
# Usage: ParseHeapDump.sh <path/to/dump.hprof> [report]*
#
# The leak report has the id org.eclipse.mat.api:suspects
# The top component report has the id org.eclipse.mat.api:top_components
#
./MemoryAnalyzer -consolelog -application org.eclipse.mat.api.parse "$@" -vmargs -Xms8g -Xmx10g -XX:-UseGCOverheadLimit

and MemoryAnalyzer.ini file looks as

-startup
plugins/org.eclipse.equinox.launcher_1.5.0.v20180512-1130.jar
--launcher.library
plugins/org.eclipse.equinox.launcher.gtk.linux.x86_64_1.1.700.v20180518-1200
java -Xmx8g -Xms10g -jar plugins/org.eclipse.equinox.launcher_1.5.0.v20180512-1130.jar -consoleLog -consolelog -application org.eclipse.mat.api.parse "$@"
-vmargs
-Xms8g
-Xmx10g

Please tell me if I'm making a mistake in the configuration, or suggest another tool on the market.

Upvotes: 4

Views: 2900

Answers (2)

Vishy Anand

Reputation: 133

The challenge is that the available RAM needs to be larger than the heap dump (.hprof) file, and a typical Windows laptop has 16 GB of RAM or less. So analyzing a 35-40 GB heap dump on a local machine is practically impossible.

Configure MAT on a remote Unix server with enough RAM (more than the 35-40 GB dump) and run it from the command line. (The GUI is slower than the command line anyway.)
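For example, once MAT is unpacked on the server, parsing and report generation can be driven entirely from the shell. The report IDs below are the ones listed in the comments of ParseHeapDump.sh itself; the dump path is a placeholder:

./ParseHeapDump.sh /path/to/dump.hprof org.eclipse.mat.api:suspects org.eclipse.mat.api:top_components

MAT writes its index files and the zipped HTML reports next to the .hprof file, so they can be copied back and opened on a local machine.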

Assigning 8-10 GB won't work, so increase the heap assigned to the Java process:

-vmargs -Xms40g -Xmx40g
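Concretely, the ini from the question should reduce to something like this minimal sketch (launcher paths copied from the question's MAT version; the stray java ... line does not belong in the ini, and -vmargs must come last with one argument per line):

-startup
plugins/org.eclipse.equinox.launcher_1.5.0.v20180512-1130.jar
--launcher.library
plugins/org.eclipse.equinox.launcher.gtk.linux.x86_64_1.1.700.v20180518-1200
-vmargs
-Xms40g
-Xmx40g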

See the more detailed answer here: https://stackoverflow.com/a/76298700/5140851

Upvotes: 0

Alexey Ragozin

Reputation: 8379

Processing a large heap dump is a challenge. Both VisualVM and Eclipse Memory Analyzer require too much memory to process heap dumps on the order of a few dozen GiB.

Commercial profilers show better results (YourKit in particular), though I am not sure of their practical limit.

To routinely process 100+ GiB, I came up with a headless solution, heaplib, which is based on the code base of VisualVM (NetBeans, actually).

Heaplib is neither graphical nor interactive; it is oriented toward automated reporting. The tool lets you write heap-analysis code in OQL/JavaScript (or Java if you wish), though the capabilities are limited to keep memory requirements down. Processing 100 GiB can take an hour, but for a non-interactive workflow that is acceptable.
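As a rough sketch of what such an automated report looks like: heaplib builds on the NetBeans profiler heap API, so analysis code can be plain Java against the org.netbeans.lib.profiler.heap classes. The example below uses only that underlying API; heaplib's own entry points and the extra indexing it adds may differ.

import java.io.File;
import org.netbeans.lib.profiler.heap.Heap;
import org.netbeans.lib.profiler.heap.HeapFactory;
import org.netbeans.lib.profiler.heap.JavaClass;

public class TopClassesReport {
    public static void main(String[] args) throws Exception {
        // The parser keeps its working data memory-mapped on disk, so a
        // dump much larger than -Xmx can still be opened.
        Heap heap = HeapFactory.createHeap(new File(args[0]));
        // Report every class whose instances occupy more than 1 GiB shallow.
        for (Object c : heap.getAllClasses()) {
            JavaClass cls = (JavaClass) c;
            if (cls.getAllInstancesSize() > 1L << 30) {
                System.out.println(cls.getName()
                        + ": " + cls.getInstancesCount() + " instances, "
                        + cls.getAllInstancesSize() + " bytes shallow");
            }
        }
    }
}

Run it with the NetBeans profiler jar (or heaplib) on the classpath and the dump path as the argument; the whole pass prints a flat report without ever starting a GUI.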

Upvotes: 3
