java: General approach to out-of-memory error

I have been working on a Java project with multiple modules. For quite some time now, I have occasionally been getting a "java: Out of Memory" error. I am pretty new to this 'popular' error and want to know the general approach to solving such errors.
Also, are there standard tools accepted by the industry to help figure out the cause of such errors?
The modules in my project include polling a third party every minute (via a web service) and multi-threading, among other things. However, this is just context; I am looking for a general approach rather than something specific to my project.
Thanks.

Sometimes you just have a class that uses a lot of memory and you need to increase the heap size or use a more space-efficient algorithm. Other times it is a leak and you need to dereference objects.
Run jvisualvm (it's included in the JDK).
Connect to your process and, if you can, try to recreate the out-of-memory error while keeping an eye on the heap size.
Perform a heap dump when the memory grows large. Search for the largest objects by size; often that will point you at the culprit class.
Look at the dependencies to see what is holding the references. If it is a memory leak, make sure to dereference the objects you no longer need.
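If you prefer the command line, the same dump can be taken with jmap (shipped with the JDK); the process id and output path below are placeholders:

    jmap -dump:live,format=b,file=/tmp/heap.hprof 12345

The resulting .hprof file can then be opened in jvisualvm or Eclipse MAT.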

Also, are there standard tools accepted by the industry to help figure out the cause of such errors?
Yes, there are memory profilers such as VisualVM and YourKit. I use the latter extensively, for both CPU and memory profiling, and find it extremely useful. To get some idea of what it's capable of, take a look at this page: link.

If you can't increase the available memory, you have to consume less.
Don't keep references to objects that you don't need at the time of execution (such as data you can reload dynamically), and if necessary redesign your flow to need less memory at any one time (e.g. process objects sequentially rather than all in parallel). Garbage collection should do the rest for you.
Especially if you load big data objects into memory, consider a streaming approach where possible. For example, you don't need to load a whole file into memory to search through it; you can simply step through it.
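A minimal sketch of that streaming idea (the file name and search term are made up for illustration):

    import java.io.BufferedReader;
    import java.io.FileReader;
    import java.io.IOException;

    public class StreamingSearch {
        public static void main(String[] args) throws IOException {
            // Read the file one line at a time instead of loading it all into memory.
            try (BufferedReader reader = new BufferedReader(new FileReader("huge-log.txt"))) {
                String line;
                while ((line = reader.readLine()) != null) {
                    if (line.contains("OutOfMemoryError")) { // placeholder search term
                        System.out.println(line);
                    }
                }
            }
        }
    }

Only one line is held in memory at a time, so the size of the file no longer matters for the heap.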
Besides architectural problems you can also have leaks: unintentional references kept to objects you no longer need. Since they are still referenced, the garbage collector can't free the memory and you run out of it at some point. That is probably the #1 cause of OutOfMemoryError, and it usually involves static references, because classes (and therefore their static fields) are normally never unloaded once they have been loaded. The internet has many articles on finding and fixing those, e.g. How to Fix Memory Leaks in Java
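A minimal sketch of that kind of static-reference leak (the class and method names are invented for illustration):

    import java.util.ArrayList;
    import java.util.List;

    public class PollResultCache {
        // Static field: everything added here stays reachable for the lifetime
        // of the class, so it can never be garbage collected on its own.
        private static final List<byte[]> RESULTS = new ArrayList<byte[]>();

        public static void store(byte[] payload) {
            RESULTS.add(payload); // grows on every poll -> eventual OutOfMemoryError
        }

        public static void clear() {
            RESULTS.clear(); // dereference entries once they are no longer needed
        }
    }

If nothing ever calls clear() (or removes individual entries), the collection grows until the heap is exhausted.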
One tool I know of is MAT (the Eclipse Memory Analyzer).

You likely have a memory leak. Finding it is a challenge. NetBeans has some tools to help you profile the VM: you can profile your project and view memory usage while it runs. Apache JMeter is also available as a plug-in, or you can run it on its own.
JMeter.apache.org

If you get OOM too often, start the JVM with the right options to capture a heap dump, then analyze the dump with jhat or with the Memory Analyzer from Eclipse (http://www.eclipse.org/mat/):
-XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=<path to dump file>
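For example, a launch command might look like this (the main class and dump path are just placeholders):

    java -Xmx512m -XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=/tmp/dumps com.example.Main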

Related

VisualVM memory leaks?

I am trying to detect memory leaks in my Java application using VisualVM. I am using VisualVM 1.3.5.
I followed the steps described in this tutorial: http://rejeev.blogspot.in/2009/04/analyzing-memory-leak-in-java.html
After following those steps, I still don't know where to start editing my code. Is there any way to find the Java class and the line number where the memory is being leaked, so that I can correct the code?
Or can anyone suggest a good way to find memory leaks using VisualVM?
Good answers are definitely appreciated.
No profiling tool will give you the line where a potential memory leak is occurring.
Profiling an application takes a bit more effort than that. Usually, a tool like VisualVM will, for instance, show you what type of objects are being instantiated the most, and that can indicate where the problem is.
For instance, if a huge amount of byte[] objects are being created, perhaps you're not closing the Input/Output streams you are creating?
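If that turns out to be the cause, a rough sketch of the usual fix is try-with-resources (the file name and class are placeholders), which closes the stream even when an exception is thrown:

    import java.io.BufferedInputStream;
    import java.io.FileInputStream;
    import java.io.IOException;
    import java.io.InputStream;

    public class StreamCloseExample {
        public static int countBytes(String path) throws IOException {
            // The stream is closed automatically when the try block exits,
            // so its buffers become eligible for garbage collection.
            try (InputStream in = new BufferedInputStream(new FileInputStream(path))) {
                int count = 0;
                while (in.read() != -1) {
                    count++;
                }
                return count;
            }
        }
    }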
There is no silver bullet to find memory leaks, it takes effort and some practice, and is completely application dependent.
That being said, this link might help as well:
http://www.kdgregory.com/index.php?page=java.outOfMemory

Eclipse Memory Analyzer - Leak Suspects Report doesn't point to MY classes - why?

I'm trying to determine whether or not I have a memory leak in my webapp. I'm using VisualVM and JMeter to load test and watch the heap.
I saved a heap dump to file and downloaded Eclipse Memory Analyzer yesterday...after much frustration with VisualVM, I thought Eclipse would pinpoint the leak, if any, better than VisualVM.
I opened the heap file in Eclipse and ran what they call a Leak Suspects Report. I thought it would point to a particular class in my webapp, but it doesn't. So I have no clue how to use the information it has provided to find out where in any particular class of mine the suspected leak is.
Here's the results of the Leak Suspect Report for one of my heap dump files.
One instance of "org.apache.catalina.session.StandardManager" loaded by "org.apache.catalina.loader.StandardClassLoader # 0x261bdac0" occupies 16,977,376 (48.54%) bytes. The memory is accumulated in one instance of "java.util.concurrent.ConcurrentHashMap$Segment[]" loaded by "".
Keywords
org.apache.catalina.loader.StandardClassLoader # 0x261bdac0
org.apache.catalina.session.StandardManager
java.util.concurrent.ConcurrentHashMap$Segment[]
The rest of the Details in the report are as shown in the attached image. I hope the image can be expanded for a closer look....
I know that Eclipse is supposed to be really good software. This is my last attempt to use something like this to find a memory leak - I just have very, very, limited knowledge in HOW this software can be used for such. Tutorial and help pages describe things as though you should know what to do after a few clicks... I need more help than that.
While I don't have any experience with using Eclipse for finding leaks, I would ask a question first: how sure are you that you have a memory leak? From your question, it doesn't sound like you are sure; it sounds like you are testing to see whether you have one. The simplest way to test that would be to start your application, note how much memory it is consuming, have JMeter hit it continuously for 24 hours, and then see how much memory it is consuming (preferably after forcing a GC). If your application is consuming significantly more memory, or has died with an OutOfMemoryError, then you have a memory leak.
If you find that you actually do have a memory leak, then I would first suggest running your application through FindBugs to see if it can find the memory leaks through a quick static analysis. If that doesn't work, then this article (although it is rather old) might help you understand the results given to you by Eclipse.

Memory Leak in a Java based application

A memory leak happens in an application when a short-lived object holds a long-lived object.
My question is, how can we identify:
1) Which objects live longer and which live shorter? Is there any tool that measures the lifetime of an object?
2) I am constantly getting an out-of-memory error. I tried increasing the heap to 2 GB, but I am still getting it. Please suggest an open-source tool with which I can identify and fix the memory leak.
At present I am restarting the server every time as a temporary workaround, but please suggest something I can fix permanently.
You can use the VisualVM tool included in the JDK:
http://download.oracle.com/javase/6/docs/technotes/tools/share/jvisualvm.html
Documentation available here:
https://visualvm.dev.java.net/docindex.html
There are two possibilities:
It may simply be that your application doesn't have enough heap allocated. Measure the size of your input and give the application a correspondingly sized heap.
There's a memory leak: take a profiler, examine your heap, find objects which shouldn't be there or of which there are too many (the 'short-living' objects, in your terms), identify which 'long-living' object holds them, and fix that. You need to know your code to understand which objects should be 'short-living' and which 'long-living'.
I've found the Heap Walker in NetBeans very useful.
As mentioned, jvisualvm has good tools to analyze the heap live.
But you can also use jvisualvm or -XX:+HeapDumpOnOutOfMemoryError to dump the heap to a file, then take that file to your desktop and open it in the Eclipse Memory Analyzer. Eclipse MAT is even better for analyzing the memory.
Out of memory occurs on a server because it literally uses up all the memory it's allowed to have. I'm not sure which application you're using to host the server, but for Apache you need to add the option -Xmx512m, where 512 is the maximum number of megabytes it's allowed to use.
If you leave the application running long enough, it's going to happen. This isn't because of memory leaks in your Java code but because of the server itself, which has a tendency to do this. You can't change that behavior, but you can at least increase the default memory of 256 MB. With the heavily loaded site that I work on every day, 256 MB unfortunately lasts about 30 minutes. I've found that 1024 MB is reasonable and rarely crashes due to out-of-memory errors.
It would strike me as very unusual for Java to be incapable of garbage collecting correctly unless the programmer has overridden typical functionality.
I think you can track memory leaks with jconsole (which ships with JDK 6, if I'm not mistaken).
A short-lived object holding a reference to a long-lived object will not cause problems (a good overview, including generational garbage collection).
2 GB is an awful lot of objects/references. If you're running out of heap space at 2 GB, you're likely holding onto massive amounts of data and/or keeping resources open after you're done with them. At the very least, you should post a description of what your application does and how long it takes to die.
You can quickly get some sense of what's happening by watching the garbage collector (e.g. run with "-verbose:gc", which will tell you when the garbage collector runs and how much it collects).
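As a starting point (exact flag names vary by JVM version; these work on HotSpot up to Java 8, while newer JVMs use -Xlog:gc instead):

    java -verbose:gc -XX:+PrintGCDetails -XX:+PrintGCTimeStamps com.example.Main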

Java App with lots of memory problems

I've written a pretty complex Java application that does a lot of calculations on price data from the markets in real time. Looking at the Task Manager in Windows, this sucker is taking close to 1 MB more every 30 seconds, and performance is fine until it gets near the memory limit around 300 MB; then the garbage collector really kicks in, spikes my CPU to around 50%, and UI performance rapidly degrades. From everything I've written so far it sounds like I have some bad code going on, because by design my program is CPU-intensive but stores very little data in memory.
I need some help on what good next steps might be to figure out what the problem is. I think it would help if I could see which objects are being kept in memory, as maybe I have some lousy code, but I am heartbroken with Java as I thought these were problems I would not have to worry about. Thanks for any answers. - Duncan
Identify some reasonable performance targets (memory usage, throughput, latency).
Put together some repeatable performance tests, the closer you can get these to real life scenarios the better.
Get a hold of a good profiler. I've used YourKit with a lot of success, the Netbeans and Eclipse profilers are not bad either. Most decent profilers will be able to identify memory usage, GC and performance hotspots.
Identify the biggest culprits and start fixing the issues beginning at the TOP of the list.
Check out VisualVM. It's in the current JDK bin directory as jvisualvm. If you don't have a memory leak, the heap usage should go down when you run the garbage collector, and you can see which objects may be holding memory by calculating the retained sizes of objects in the heap.
http://download.oracle.com/javase/6/docs/technotes/guides/visualvm/intro.html
Like others say, use a profiler to find what is consuming the memory.
If you don't know already, the garbage collector can only release memory for objects that are no longer reachable, that is, objects with no remaining references to them. Just make sure an object goes out of scope when you're done with it. It sounds like you're holding onto something in a way where it's still referenced somewhere.
Also, if you want to suggest to the GC that it cleans up, try this:
System.gc();
System.runFinalization();
Again, that is only a suggestion to the gc; but I've found it really helps if you run it after a lot of objects go out of scope.
Lastly, you can tweak your VM arguments.
There are settings for the minimum and maximum heap sizes. If it's a critical application, set them to the same value and set it high (that way the JVM doesn't have to keep resizing the heap; it just grabs one big chunk at startup). This isn't a fix, just a workaround.
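A sketch of that workaround (the sizes and jar name are placeholders; tune them to your measured footprint):

    java -Xms1024m -Xmx1024m -jar my-app.jar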

Strategies for the diagnosis of Java memory issues

I've been tasked with debugging a Java (J2SE) application which after some period of activity begins to throw OutOfMemory exceptions. I am new to Java, but have programming experience. I'm interested in getting your opinions on what a good approach to diagnosing a problem like this might be?
So far I've employed JConsole to get a picture of what's going on. I have a hunch that there are objects which are not being released properly and are therefore not being cleaned up during garbage collection.
Are there any tools I might use to get a picture of the object ecosystem? Where would you start?
I'd start with a proper Java profiler. JConsole is free, but it's nowhere near as full featured as the ones that cost money. I used JProfiler, and it was well worth the money. See https://stackoverflow.com/questions/14762/please-recommend-a-java-profiler for more options and opinions.
Try the Eclipse Memory Analyzer, or any other tool that can process a Java heap dump, and then run your app with the flag that generates a heap dump when you run out of memory.
Then analyze the heap dump and look for suspiciously high object counts.
See this article for more information on the heap dump.
EDIT: Also, please note that your app may just legitimately require more memory than you initially thought. You might try increasing the Java minimum and maximum memory allocation to something significantly larger first and see whether your application runs indefinitely or simply gets slightly further.
The latest version of the Sun JDK includes VisualVM, which is essentially the NetBeans profiler packaged on its own. It works really well.
http://www.yourkit.com/download/index.jsp is the only tool you'll need.
You can take snapshots at (1) application start time and (2) after running the application for some amount of time, then compare the snapshots to see where memory is being allocated. It will also take a snapshot on OutOfMemoryError, so you can compare that snapshot with (1).
For instance, the latest project I had to troubleshoot threw OutOfMemoryError exceptions, and after firing up YourKit I realised that most memory was in fact being allocated to some ehcache "LFU" class; the point being that we had specified loads of a certain POJO to be cached in memory, but had not specified enough -Xms and -Xmx (initial and maximum JVM memory allocation).
I've also used Linux's vmstat (e.g. some Linux platforms just don't have enough swap enabled, or don't allocate contiguous blocks of memory), and then there's jstat (bundled with the JDK).
UPDATE: see https://stackoverflow.com/questions/14762/please-recommend-a-java-profiler
You can also add an UncaughtExceptionHandler to your application's threads. This will catch 'uncaught' throwables, like an OutOfMemoryError, and you will at least have an idea of where it was thrown. Usually that is not where the problem is, but rather just the 'new' that couldn't be satisfied. As a rule I always add an UncaughtExceptionHandler to a thread, if nothing else to add logging.
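A minimal sketch of such a handler (the logging here is just a placeholder; use whatever logging your application already has):

    public class Main {
        public static void main(String[] args) {
            // Log anything (including OutOfMemoryError) that escapes any thread.
            Thread.setDefaultUncaughtExceptionHandler(new Thread.UncaughtExceptionHandler() {
                public void uncaughtException(Thread t, Throwable e) {
                    System.err.println("Uncaught throwable in thread " + t.getName());
                    e.printStackTrace();
                }
            });
            // ... start the rest of the application here
        }
    }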
