WeakHashMap randomly clears - Java

I am running a game; when it starts up I load images into a WeakHashMap of Images. When I run my game, my RAM usage just keeps going up, then eventually my WeakHashMap just unloads all of its data. Is this related to garbage collection? Any solutions?

You can create a HashMap using SoftReferences instead of WeakReferences - the garbage collector will be a bit less eager about GCing it. Just copy the WeakHashMap source code, replacing the WeakReferences with SoftReferences.

As Louis Wasserman suggested, this is expected behaviour. I think you may want a normal HashMap. Please read the docs regarding WeakHashMap at http://docs.oracle.com/javase/6/docs/api/java/util/WeakHashMap.html

A java.util.WeakHashMap is a type of map that, as its description might suggest, keeps only weak references to its keys. Weak references, as you know, are references that do not prevent the garbage collector from collecting the referenced objects. In order to prevent an object from being garbage collected, you must maintain a strong reference to the object somewhere.
If you want the data to be protected from garbage collection, store it in a regular HashMap. For your particular application, you may want to write your own map implementation that keeps soft references (references that the gc only clears if it has to, rather than always clearing) to the images, and have it automatically load missing art when that art is called for. (Could be tricky if you need it to be all thread safe, though...)
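A minimal sketch of such a soft-reference image cache might look like this (the SoftImageCache and ImageLoader names are made up for illustration, not part of any library):
import java.awt.Image;
import java.lang.ref.SoftReference;
import java.util.HashMap;
import java.util.Map;

public class SoftImageCache {
    // Hypothetical callback for loading an image by name (e.g. from disk).
    public interface ImageLoader {
        Image load(String name);
    }

    private final Map<String, SoftReference<Image>> cache = new HashMap<String, SoftReference<Image>>();
    private final ImageLoader loader;

    public SoftImageCache(ImageLoader loader) {
        this.loader = loader;
    }

    public synchronized Image get(String name) {
        SoftReference<Image> ref = cache.get(name);
        Image image = (ref != null) ? ref.get() : null;
        if (image == null) {
            // Never loaded, or cleared by the GC under memory pressure: reload it.
            image = loader.load(name);
            cache.put(name, new SoftReference<Image>(image));
        }
        return image;
    }
}
The synchronized method is only the simplest way to handle the thread-safety caveat mentioned above.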

Related

OutOfMemoryError:GC Overhead limit exceeded

In one of our Java applications we have got
OutOfMemoryError: GC overhead limit exceeded.
We have used HashMaps in some places for storing some data. From the logs I can identify that it reproduces at the same place.
I wanted to ask if the garbage collector spends more time clearing up the HashMaps?
Upon looking at the code (I can't share it here), I have found that there is a HashMap created like
HashMap topo = new HashMap();
but this HashMap is never used.
Is this a kind of memory leak in my application?
If this HashMap is created inside a method which does some processing and is not used elsewhere, and this method is accessed by multiple threads, say 20, then in such a case would creating the HashMap as above cause the garbage collector to spend more time recovering heap and throw an OOME?
Please let me know if you need some more details.
In one of our Java applications we have got OutOfMemoryError: GC overhead limit exceeded. We have used HashMaps in some places for storing some data. From the logs I can identify that it reproduces at the same place.
If the HashMap just keeps growing, and is most likely marked as static, that means you keep adding things to it and never delete anything. Then one fine day it will lead to an OutOfMemoryError.
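A sketch of that pattern (the class and field names here are invented for illustration, not taken from the poster's code):
import java.util.HashMap;
import java.util.Map;

public class RequestLog {
    // A static map that is only ever added to. Every entry stays strongly
    // reachable for the lifetime of the application, so the heap keeps growing
    // until the GC can no longer keep up and an OutOfMemoryError is thrown.
    private static final Map<String, String> SEEN = new HashMap<String, String>();

    public static void record(String requestId, String payload) {
        SEEN.put(requestId, payload); // added, but never removed
    }
}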
I wanted to ask if the garbage collector spends more time clearing up the HashMaps?
The garbage collector spends time on objects which are not referenced, or are only weakly or softly referenced. Wherever it finds such objects, it will clear them depending on the need.
Upon looking at the code (I can't share it here), I have found that there is a HashMap created like HashMap topo = new HashMap();, but this HashMap is never used. Is this a kind of memory leak in my application?
If this HashMap is created inside a method which does some processing and is not used elsewhere, and this method is accessed by multiple threads, say 20, then in such a case would creating the HashMap as above cause the garbage collector to spend more time recovering heap and throw an OOME?
If the HashMap is local to a method, and the method exits after doing some processing, then it becomes eligible for garbage collection as soon as the method exits. Since the HashMap is local to the method, each thread will have a separate copy of this map, and once a thread finishes executing the method the map is eligible for GC.
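For illustration, a sketch of the method-local case (not the poster's actual code):
import java.util.HashMap;
import java.util.Map;

public class TopoExample {
    // Each thread calling this method gets its own local map. When the method
    // returns, nothing references the map any more, so it is eligible for GC.
    public void process() {
        Map<String, Integer> topo = new HashMap<String, Integer>();
        topo.put("node", 1);
        // ... some processing that never publishes 'topo' outside the method ...
    }   // 'topo' goes out of scope here and becomes garbage
}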
You need to look for long-lifetime objects & structures, which might be the actual problem, rather than wildly grasping at some clueless manager's idea of the potential problem.
See:
How to find memory leaks using visualvm
How to find a Java Memory Leak
Look out especially for static or application-lifetime Maps or Lists, which are added to during the lifetime rather than just at initialization. It will most likely be one, or several, of these that are accumulating.
Note also that inner classes (Listeners, Observers) can capture references to their containing scope & prevent these from being GC'ed indefinitely.
Please let me know if you need some more details.
You need some more details. You need to profile your application to see what objects are consuming the heap space.
Then, if some of the sizeable objects are no longer actually being used by your application, you have a memory leak. Look at the references to these objects to find out why they're still being held in memory when they're no longer useful, and then modify your code to no longer hold these references.
Alternatively, you may find that all of the objects in memory are what you would expect as your working set. Then either you need to increase the heap size, or refactor your application to work with a smaller working set (e.g. streaming events one at a time rather than reading an entire list; storing the last session details in the database rather than in memory; etc.).

Can I Force Garbage Collection in Java? [duplicate]

Possible Duplicate:
Forcing Garbage Collection in Java?
Can I Force Garbage Collection in Java by any means?
System.gc() is just a suggestion. It's useless.
When I know for sure that some resources won't be used any more, why can't I force them to be cleaned up?
Just like delete in C++ and free() in C?
When there are lots of resources that can't be reused, this can really hurt performance. All that we can do is sleep().
Any solutions? Thanks
Nope, System.gc() is as close as you can get. Java isn't C or C++; the JVM manages memory for you, so you don't have that kind of fine-grained control. If you set objects you're no longer using to null, or lose all references to them, they will get cleaned up. And the GC is pretty smart, so it should take good care of you.
That said, if you are on a unix box, and force a thread dump (kill -3), it'll pretty much force garbage collection.
You shouldn't be trying to force GC - if you are running low on memory then you have a memory leak somewhere. Forcing GC at that point won't help, because if you are holding a reference to the object then it still won't be garbage collected.
What you need to do is solve the real problem, and make sure you are not holding references to objects you are not using any more.
Some common culprits:
Holding lots of references in a large object graph that never get cleared up. Either set references to null when you don't need them any more, or better still simplify your object graph so it doesn't need all the extra long-term references.
Caching objects in a hashmap or something similar that grows huge over time. Stop doing this, or use something like Google's CacheBuilder to create a proper soft reference cache (see the sketch after this list).
Using String.intern() excessively on large numbers of different strings over time.
References with larger scope than they need. Are you using an instance variable when it could be a local variable, for example?
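A minimal sketch of the Guava cache suggested above, assuming the Guava library is on the classpath (the key/value types and the expensiveLoad method are placeholders):
import com.google.common.cache.CacheBuilder;
import com.google.common.cache.CacheLoader;
import com.google.common.cache.LoadingCache;

public class SoftValueCacheExample {
    // Values are held via SoftReferences and the cache is bounded in size,
    // so it cannot grow without limit the way a plain HashMap cache can.
    private final LoadingCache<String, byte[]> cache = CacheBuilder.newBuilder()
            .maximumSize(1000)
            .softValues()
            .build(new CacheLoader<String, byte[]>() {
                @Override
                public byte[] load(String key) {
                    return expensiveLoad(key); // recompute or reload on a miss
                }
            });

    public byte[] get(String key) {
        return cache.getUnchecked(key);
    }

    private byte[] expensiveLoad(String key) {
        return new byte[0]; // placeholder for the real loading logic
    }
}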
There is no way to explicitly instruct the JVM to collect garbage. This is only performed when the system needs the resources.
The only two actions I'm aware of to potentially get the GC running are the following:
As you stated, attempt to "suggest" that now would be a good time for GC by calling System.gc().
Set any references you are not using to the null reference to make the elements eligible for collection.
On my second point, see the answer here: Garbage collector in java - set an object null. In essence, if you don't make the objects you don't need available for garbage collection (by losing the reference you have to it) then there is no reason for the garbage collector to run, because it's unaware of any available garbage.
In addition, it's important to consider why/how those objects in memory are affecting performance:
Are you getting lots of OutOfMemoryErrors? This could be resolved by point #2 and by increasing the available heap space for the JVM.
Have you done measurements to see that more objects in the JVM's allocated heap space makes a difference in performance? Determining when you could let references to objects go earlier could help reduce these issues.

Avoiding multiple Garbage Collection execution

I read in some blog that GC in Android happens on the main (UI) thread; this may create sluggishness in the UI depending on the frequency of GC execution.
Hence I was wondering whether it would be a good idea to manually release objects (by assigning them null) which have no further use for me.
This way we may avoid multiple executions of GC in the application.
Please share your thoughts.
Thanks,
sku
There's no such thing as "manually releasing objects" -- at least not in any way that's meaningful to GC. An object doesn't immediately get freed/collected/whatever when you lose all references to it; it just becomes eligible for collection. GC is what actually does the releasing of the object, and it does so when it feels like doing so.
The only real way to keep the GC from working so hard is to create fewer objects, particularly temporary objects. Less garbage == less collection.
Releasing (dereferencing) objects for which you have no further use is always a good idea. You can also use SoftReference, WeakReference and/or WeakHashMap to help the GC pick up stuff that you don't mind going away if the system needs space.
There's more information about Android's GC system here.

How to cause soft references to be cleared in Java?

I have a cache which has soft references to the cached objects. I am trying to write a functional test for behavior of classes which use the cache specifically for what happens when the cached objects are cleared.
The problem is: I can't seem to reliably get the soft references to be cleared. Simply using up a bunch of memory doesn't do the trick: I get an OutOfMemoryError before any soft references are cleared.
Is there any way to get Java to more eagerly clear up the soft references?
Found here:
"It is guaranteed though that all
SoftReferences will get cleared before
OutOfMemoryError is thrown, so they
theoretically can't cause an OOME."
So does this mean that the above scenario MUST mean I have a memory leak somewhere with some class holding a hard reference on my cached object?
The problem is: I can't seem to reliably get the soft references to be cleared.
This is not unique to SoftReferences. Due to the nature of garbage collection in Java, there is no guarantee that anything that is garbage-collectable will actually be collected at any point in time. Even with a simple bit of code:
Object temp = new Object();
temp = null;
System.gc();
there is no guarantee that the Object instantiated in the first line is garbage collected at this point, or in fact at any point. It's simply one of the things you have to live with in a memory-managed language: you're giving up declarative power over these things. And yes, that can make it hard to definitively test for memory leaks at times.
That said, as per the Javadocs you quoted, SoftReferences should definitely be cleared before an OutOfMemoryError is thrown (in fact, that's the entire point of them and the only way they differ from default object references). It would thus sound like there is some sort of memory leak, in that you're holding onto hard references to the objects in question.
If you use the -XX:+HeapDumpOnOutOfMemoryError option to the JVM, and then load the heap dump into something like jhat, you should be able to see all the references to your objects and thus see if there are any references beside your soft ones. Alternatively you can achieve the same thing with a profiler while the test is running.
There is also the following JVM parameter for tuning how soft references are handled:
-XX:SoftRefLRUPolicyMSPerMB=<value>
Where 'value' is the number of milliseconds a soft reference will remain alive for every free MB of memory. The default is 1s/MB, so if an object is only softly reachable it will last 1s if only 1MB of heap space is free.
You can force all SoftReferences to be cleared in your tests with this piece of code.
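The linked snippet is not reproduced here, but a common version of the trick (an assumption about its approach, not the exact code) is to allocate until an OutOfMemoryError is caught, relying on the guarantee that all softly reachable objects are cleared before the error is thrown:
import java.util.ArrayList;
import java.util.List;

public class SoftReferenceClearer {
    // Fill the heap until an OutOfMemoryError occurs; the JVM must clear all
    // softly reachable objects before throwing it, so afterwards every
    // SoftReference in the process has been cleared.
    public static void forceClearSoftReferences() {
        try {
            List<byte[]> hog = new ArrayList<byte[]>();
            while (true) {
                hog.add(new byte[1 << 20]); // 1 MB at a time
            }
        } catch (OutOfMemoryError expected) {
            // The allocated chunks become unreachable here, so the memory is recoverable.
        }
    }
}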
If you really wanted to, you can call clear() on your SoftReference to clear it.
That said, if the JVM is throwing an OutOfMemoryError and your SoftReference has not been cleared yet, then this means that you must have a hard reference to the object somewhere else. To do otherwise would invalidate the contract of SoftReference. Otherwise, you are never guaranteed that the SoftReference is cleared: as long as there is still memory available, the JVM does not need to clear any SoftReferences. On the other hand, it is allowed to clear them next time it does a GC cycle, even if it doesn't need to.
Also, you can consider looking into WeakReferences, since the VM tends to be more aggressive in clearing them. Technically, the VM isn't ever required to clear a WeakReference, but it is supposed to clean them up the next time it does a GC cycle if the object would otherwise be considered dead. If you are trying to test what happens when your cache is cleared, using WeakReferences should help your entries go away faster.
Also, remember that both of these are dependent on the JVM doing a GC cycle. Unfortunately, there is no way to guarantee that one of those ever happens. Even if you call System.gc(), the garbage collector may decide that it is doing just peachy and choose to do nothing.
In a typical JVM implementation (Sun) you need to trigger a full GC more than once to get the SoftReferences cleared. The reason is that SoftReferences require the GC to do more work, for example because of the mechanism that allows you to be notified when the objects are reclaimed.
IMHO using a lot of SoftReferences in an application server is evil, because the developer has not much control over when they are released.
Garbage collection and reference types like soft references are non-deterministic, so it's not really possible to reliably arrange for soft references to be definitely cleared at a given point so that your test can judge how your cache reacts. I would suggest you simulate the reference clearing in a more definite way, by mocking etc. - your tests will be reproducible and more valuable, rather than just hoping for the GC to clean up references. The latter approach is a really bad thing to do and will just introduce additional problems rather than help you improve the quality of your cache and its collaborating components.
From the documentation and my experience I'd say yes: you must have a reference somewhere else.
I'd suggest using a debugger that can show you all references to an object (such as Eclipse 3.4 when debugging Java 6) and just check when the OOM is thrown.
If you use Eclipse, there is a tool named Memory Analyzer that makes heap dump debugging easier.
Does the cached object have a finalizer? The finalizer will create new strong references to the object, so even if the SoftReference is cleared the memory will not be reclaimed until a later GC cycle.
If you have a cache which is a Map of SoftReferences and you want them cleared, you can just clear() the map and they will all be cleaned up (including their references).

Removing objects from Java Collections

I have a HashMap (although I guess this question applies to other collections) of objects. From what I understand, when the documentation talks about removing mappings, it means removing the entry from the hash table, i.e. not necessarily destroying the actual object. If the only remaining reference to the object is in this table, then will the object get garbage collected?
If I do map.clear() and those objects that were in the table are not referenced anywhere else, will they get garbage collected?
What is the fastest way to actually remove all entries from the table, but also destroy those objects?
Yes, if the collection is the last place these objects are referenced they are eligible for garbage collection after they have been removed from the collection. And no, you can not destroy these objects forcefully. The garbage collector will handle them when it feels like it.
If the only remaining reference to the object is in this table, then will the object get garbage collected?
If there are no other references to an object, then the object will be garbage collected sometime in the future.
You should not have to force destruction of the objects. If they are extremely heavyweight objects (or you have too many objects to fit in memory), this points to a more fundamental problem with your code.
If you really must, then you can call System.gc(), although this is not good practice, and will always be a bellwether of underlying problems in your code.
Generally speaking, you have no strong control over when an object is specifically destroyed. Any object is eligible for garbage collection when there are no more (strong) reference to it - but there are no guarantees about when it will be garbage collected or in fact if it ever will be. Even calling System.gc() or Runtime.gc() provides no guarantees about actually doing anything, it's merely a hint to the JVM that it might want to consider garbage collecting now. I believe the only guarantee you get is that if an OutOfMemoryError is thrown, all potential garbage collections were done before the error was thrown.
There are implications here for handling sensitive information such as passwords. Since Strings cannot be programmatically cleared, you ideally don't want to store a password as a String. If you instead store it as an array of characters, you can then use Arrays.fill to overwrite the password and guarantee it is no longer resident in memory from that point.
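For example, a sketch of that pattern (the method name is arbitrary):
import java.util.Arrays;

public class PasswordHandling {
    public static void authenticate(char[] password) {
        try {
            // ... use the password while it is needed ...
        } finally {
            // Overwrite the characters so the secret does not linger on the heap
            // until the GC gets around to it; a String could not be scrubbed this way.
            Arrays.fill(password, ' ');
        }
    }
}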
Back on topic - you are right that both operations will make the object eligible for garbage collection if it is not referenced elsewhere. Collection.clear() is indeed the fastest way to drop references to all the objects in a collection at once.
Note that WeakHashMap allows you to place objects in it and have them eligible for garbage collection as soon as there are no more references to the key (not the value) outside the map - the map entry will disappear at this point.
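A small sketch of that behaviour (whether the entry is actually gone by the final line depends on the GC having run, hence the System.gc() hint):
import java.util.Map;
import java.util.WeakHashMap;

public class WeakHashMapDemo {
    public static void main(String[] args) {
        Map<Object, String> map = new WeakHashMap<Object, String>();
        Object key = new Object();
        map.put(key, "value");

        key = null;   // drop the only strong reference to the key
        System.gc();  // only a hint; the key may or may not have been collected yet

        System.out.println(map.size()); // typically prints 0 once the key is collected
    }
}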
In general you should not worry about when objects are garbage collected - the JVM decides this, and it knows a lot more about its memory needs and possible delays than you. What you should worry about is to make sure that objects you don't need anymore are eligible for garbage collection.
To have something truly garbage collected there can be no strong references to the object. Objects held only via WeakReferences may be garbage collected. Use a WeakHashMap to allow the entries to be garbage collected, as a plain HashMap still holds strong references to the objects.
You can initiate a call to System.gc() after you clear your map, but it's generally not a good idea.
