How can I generate a Java heap usage graph from a log file?

I'm dealing with an OOME (OutOfMemoryError) in our company's service, and I now have an hprof file and a gc.log file.
I'd like to see how each heap space's usage changes over time, just like the image below.
But I don't know how I can get such a graph.
MAT doesn't have such a feature. I know some other GUI analysis tools, but they are only for real-time monitoring, not for generating a graph from an existing log file.
Does anyone know another good way?

If you have a memory dump file, try jvisualvm; some additional plugins may help with more insight, e.g. https://dzone.com/articles/visualvm-gcviewer-plugin
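If all you need is a graph of the existing gc.log, the standalone GCViewer that the linked plugin is based on can render it directly; the jar name/version below is an assumption:

java -jar gcviewer-1.36.jar gc.log

It also has a batch mode that writes a summary and a chart image without opening the GUI (the output file names here are placeholders):

java -jar gcviewer-1.36.jar gc.log summary.csv chart.png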

Related

What is the best format for a heap dump file to analyze? Is it hprof?

I was assigned to find the root cause of a Full GC issue (garbage collection) in our production environment. It occurs randomly, and I believe the most probable cause is a memory leak in the current application. I hope to take a memory dump from our production environment (Linux) and analyze it using GUI tools such as Eclipse Memory Analyzer.
What is the best file format for a heap dump file to be analyzed? Is it the hprof format? I am going to use the jmap command to obtain the heap dump. Is it necessary to specify the "format=b" option while obtaining the memory dump?
Following is a sample command I am going to instruct our support team to run (5980 is a sample pid).
If you believe in a better approach, please let me know.
jmap -dump:format=b,file=hpdump.hprof 5980
Thanks
The command seems to be fine. Yes, there are a few other tools available to analyze the heap dump; please refer to this.
I was not able to take heap dump files because of space limitations in the production environment. Analyzing histogram files also took me nowhere. Instead, I found a GC log analyzer tool, IBM GCMV, that helps to analyze GC logs from the identified days of the year that caused the Full GC issue.
Plotted graphs of the GC logs revealed the whole story. A meteoric rise in memory consumption could be observed in every scenario, and it took less than 10 minutes to reach maximum memory from the beginning of a certain event. I calculated the timestamps of the initial rising points and compared them with the server logs for the corresponding times. That provided strong evidence against a few specific processes that frequently appeared in the server logs when the rise commenced.

Analyze an HPROF memory dump file from the command line programmatically

I was investigating analyzing an HPROF file using Eclipse's Memory Analyzer (MAT).
The dominator tree, reports, and the OQL interface look really useful. But all of this has to be done manually from the MAT software.
Is there a command-line interface so I can programmatically parse the HPROF file and automatically generate custom reports?
This would be useful for integrating it into a test infrastructure to do automatic memory analysis.
Btw, the heap size will be between 10 and 60 MB.
ParseHeapDump.sh does what you're looking for; a sketch of invoking it from test code follows below. As for the follow-up question, I'm not sure what format the index files are stored in.
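For the test-infrastructure part, here is a minimal sketch that shells out to ParseHeapDump.sh and generates the leak-suspects report. The MAT install path and dump location are assumptions; the report id is the one MAT documents:

import java.io.IOException;

// Hypothetical wrapper for invoking MAT's command-line parser from test code.
public class HeapReportRunner {
    public static void main(String[] args) throws IOException, InterruptedException {
        ProcessBuilder pb = new ProcessBuilder(
                "/opt/mat/ParseHeapDump.sh",      // assumed MAT install path
                "/tmp/test-run/heap.hprof",       // assumed dump written by the test run
                "org.eclipse.mat.api:suspects");  // leak-suspects report id
        pb.inheritIO(); // stream MAT's console output to our stdout/stderr
        int exitCode = pb.start().waitFor();
        if (exitCode != 0) {
            throw new IllegalStateException("MAT report generation failed: " + exitCode);
        }
        // On success, MAT writes the report zip and its index files
        // next to the .hprof file, ready for later interactive analysis.
    }
}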
See bitbucket.org/joebowbeer/andromat, which is adapted from bitbucket.org/ekabanov/mat, which is a stripped-down command line version of Eclipse Memory Analyzer.

How can we check the past record of memory and CPU usage by the JVM?

I want to write an admin tool for a website, in which I want to show the memory usage and CPU usage of the website in a graph. I read somewhere that the JVM writes this data to a dump file, which the MAT tool uses to show its results. So my question is: where does the JVM dump this file, and at what rate?
I want to show the last 24 hours of usage on the graph, so how can we capture this data?
Looking for help.
There's nothing at the moment inherent in the JVM that will produce the stream of data you want. You could leave a profiler such as JVisualVM hooked up to your app, but that might also slow it down considerably (that said, for memory profiling it's not so bad... depends on the non-functional requirements of your app).
Two further options:
Write your own code using MBean stuff
In a ScheduledExecutorService, run a periodic process that interrogates the JVM via the MBean interfaces (a minimal sketch follows below). You can also generate a heap dump that can be viewed in MAT.
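For illustration, a minimal sketch of such a sampler (the class name, 10-second interval, and CSV columns are arbitrary choices); append its output to a file to build up the 24-hour history:

import java.lang.management.ManagementFactory;
import java.lang.management.MemoryMXBean;
import java.lang.management.MemoryUsage;
import java.lang.management.OperatingSystemMXBean;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

// Polls the JVM's own MBeans and emits one CSV row per interval:
// timestamp, heap used (bytes), heap committed (bytes), system load average.
public class UsageSampler {
    public static void main(String[] args) {
        MemoryMXBean memory = ManagementFactory.getMemoryMXBean();
        OperatingSystemMXBean os = ManagementFactory.getOperatingSystemMXBean();
        ScheduledExecutorService scheduler = Executors.newSingleThreadScheduledExecutor();
        scheduler.scheduleAtFixedRate(() -> {
            MemoryUsage heap = memory.getHeapMemoryUsage();
            System.out.printf("%d,%d,%d,%.2f%n",
                    System.currentTimeMillis(),
                    heap.getUsed(),
                    heap.getCommitted(),
                    os.getSystemLoadAverage()); // -1.0 if unavailable on this platform
        }, 0, 10, TimeUnit.SECONDS);
    }
}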
Use cron and jmap
You can also use jmap to generate heap dumps and schedule it at the operating-system level to run every once in a while, for example with a crontab entry like the one sketched below. This might be better if you don't want to, or can't, alter your code.
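A hypothetical crontab entry along those lines (the dump path and the sample pid 5980 are placeholders; note that % must be escaped in crontabs):

# Dump the heap of pid 5980 at the top of every hour
0 * * * * jmap -dump:format=b,file=/tmp/heap-$(date +\%H).hprof 5980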
AFAIK, if you want historical data you must record this yourself. When you connect using VisualVM, you always start at the point you connected.
The MAT tool can be used to examine a heap dump. It only works on a memory snapshot and is painful to use IMHO.

Viewing live heap in Eclipse

Is it possible to see the heap of a program in Eclipse itself while it is executing? Is there a plugin for that?
I don't know if there is an Eclipse plugin, but if what matters is getting the information, and not necessarily getting it through Eclipse, then you can do that with JVisualVM, and there are several plugins that provide all the details you want.
One of its features is that you can make a heap dump.
Documentation says:
Take and browse heap dumps. When you need to browse contents of application memory or uncover a memory leak in your application, you'll find the built-in HeapWalker tool really handy. It can read files written in hprof format and is also able to browse heap dumps created by the JVM on an OutOfMemoryException.
Eclipse does have a plugin called Eclipse Memory Analyzer (MAT). You can check it out here. I heard it is quite handy for heap analysis and fixing memory leaks in your program.
http://www.eclipse.org/mat/

Tool for analyzing large Java heap dumps

I have a HotSpot JVM heap dump that I would like to analyze. The VM ran with -Xmx31g, and the heap dump file is 48 GB large.
I won't even try jhat, as it requires about five times the heap memory (that would be 240 GB in my case) and is awfully slow.
Eclipse MAT crashes with an ArrayIndexOutOfBoundsException after analyzing the heap dump for several hours.
What other tools are available for that task? A suite of command line tools would be best, consisting of one program that transforms the heap dump into efficient data structures for analysis, combined with several other tools that work on the pre-structured data.
Normally, what I use is ParseHeapDump.sh, included within Eclipse Memory Analyzer and described here, and I do that on one of our more beefed-up servers (download and copy over the Linux .zip distro, unzip it there). The shell script needs fewer resources than parsing the heap from the GUI, plus you can run it on your beefy server with more resources (you can allocate more resources by adding something like -vmargs -Xmx40g -XX:-UseGCOverheadLimit to the end of the last line of the script).
For instance, the last line of that file might look like this after modification
./MemoryAnalyzer -consolelog -application org.eclipse.mat.api.parse "$@" -vmargs -Xmx40g -XX:-UseGCOverheadLimit
Run it like ./path/to/ParseHeapDump.sh ../today_heap_dump/jvm.hprof
After that succeeds, it creates a number of "index" files next to the .hprof file.
After creating the indices, I try to generate reports from them and scp those reports to my local machine, to see if I can find the culprit just by that (just the reports, not the indices). Here's a tutorial on creating the reports.
Example report:
./ParseHeapDump.sh ../today_heap_dump/jvm.hprof org.eclipse.mat.api:suspects
Other report options:
org.eclipse.mat.api:overview and org.eclipse.mat.api:top_components
If those reports are not enough, and if I need some more digging (say, via OQL), I scp the indices as well as the hprof file to my local machine, and then open the heap dump (with the indices in the same directory as the heap dump) with my Eclipse MAT GUI. From there, it does not need too much memory to run.
EDIT: I'd just like to add two notes:
As far as I know, only the generation of the indices is the memory-intensive part of Eclipse MAT. Once you have the indices, most of your processing from Eclipse MAT does not need that much memory.
Doing this via a shell script means I can run it on a headless server (and I normally do run it on a headless server, because they're normally the most powerful ones). And if you have a server that can generate a heap dump of that size, chances are you have another server out there that can process that much of a heap dump as well.
First step: increase the amount of RAM you are allocating to MAT. By default it's not very much and it can't open large files.
If you are using MAT on macOS, there is a MemoryAnalyzer.ini file in MemoryAnalyzer.app/Contents/MacOS. Making adjustments to that file didn't "take" for me. Instead, you can create a modified startup command/shell script based on the contents of that file and run it from that directory. In my case I wanted a 20 GB heap:
./MemoryAnalyzer -vmargs -Xmx20g -XX:-UseGCOverheadLimit ... other params desired
Just run this command/script from Contents/MacOS directory via terminal, to start the GUI with more RAM available.
I suggest trying YourKit. It usually needs a little less memory than the heap dump size (it indexes the dump and uses that information to retrieve what you want).
The accepted answer to this related question should provide a good start for you (if you have access to the running process, it generates live jmap histograms instead of heap dumps, and it's very fast):
Method for finding memory leak in large Java heap dumps
Most other heap analyzers (I use IBM's http://www.alphaworks.ibm.com/tech/heapanalyzer) require at least as much RAM as the heap, plus some percentage, if you're expecting a nice GUI tool.
Other than that, many developers use alternative approaches, like live stack analysis to get an idea of what's going on.
Although I must question why your heaps are so large: the effect on allocation and garbage collection must be massive. I'd bet a large percentage of what's in your heap should actually be stored in a database or a persistent cache.
This person, http://blog.ragozin.info/2015/02/programatic-heapdump-analysis.html, wrote a custom "heap analyzer" that just exposes a "query style" interface over the heap dump file, instead of actually loading the file into memory.
https://github.com/aragozin/heaplib
Though I don't know if that "query language" is better than the Eclipse OQL mentioned in the accepted answer here.
The latest snapshot build of Eclipse Memory Analyzer has a facility to randomly discard a certain percentage of objects to reduce memory consumption and allow the remaining objects to be analyzed. See Bug 563960 and the nightly snapshot build to test this facility before it is included in the next release of MAT. Update: it is now included in released version 1.11.0.
A not-so-well-known tool, http://dr-brenschede.de/bheapsampler/, works well for large heaps. It works by sampling, so it doesn't have to read the whole dump, though it is a bit finicky.
This is not a command-line solution; however, I like the tools:
Copy the heap dump to a server large enough to host it. It is quite possible that the original server can be used.
Enter the server via ssh -X to run the graphical tool remotely, and use jvisualvm from the Java binary directory to load the .hprof file of the heap dump.
The tool does not load the complete heap dump into memory at once, but loads parts when they are required. Of course, if you look around enough in the file the required memory will finally reach the size of the heap dump.
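For instance, assuming X forwarding is available (the host name and dump path are placeholders; VisualVM's --openfile flag loads the dump at startup):

ssh -X user@big-memory-server
jvisualvm --openfile /data/dumps/jvm.hprof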
I came across an interesting tool called JXray. It provides a limited evaluation trial license. I found it very useful for finding memory leaks. You may give it a shot.
Try using JProfiler; it works well at analyzing large .hprof files. I have tried it with a file sized around 22 GB.
https://www.ej-technologies.com/products/jprofiler/overview.html
It costs $499 per developer license but has a free 10-day evaluation.
When the problem can be "easily" reproduced, one unmentioned alternative is to take heap dumps before memory grows that big (e.g., jmap -dump:format=b,file=heap.bin <pid>).
In many cases you will already get an idea of what's going on without waiting for an OOM.
In addition, MAT provides a feature to compare different snapshots, which can come in handy (see https://stackoverflow.com/a/55926302/898154 for instructions and a description); an example of capturing two snapshots to compare is sketched below.
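For example, reusing the <pid> placeholder from above:

jmap -dump:format=b,file=before.hprof <pid>
# ... let the suspected leak progress for a while, then:
jmap -dump:format=b,file=after.hprof <pid>

Opening both dumps in MAT then lets you compare histograms and see which classes grew between the snapshots.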
