Resource Handling Practice - java

It's said that the Garbage Collector destroys all unwanted and unused objects. But what if we manually nullify objects, e.g. List<String> list = null?
Does this action have any negative or positive performance effect?
I am on Java.
Thanks.

Not an expert on the details of memory handling, but I can share what I know. The GC will collect whatever is no longer used. Thus, when you eliminate the last reference to an object (by explicitly nullifying it), you'll be marking it for garbage collection. This does not guarantee that it'll be collected immediately.
You can explicitly try to invoke the GC, but you'll see lots of people advising against it. My understanding is that a call to the GC is unreliable at best. The whole point of GC in Java is that you as a programmer should not need to worry much about memory allocation. As for performance, unless you have tight limitations on heap space, you shouldn't notice GC activity.

Garbage collection is the way in which Java reclaims the space occupied by loitering objects. By doing so, Java tries to ensure that your application never runs out of memory (though there is no guarantee that it never will).
It is suggested to leave it to the JVM.
Read related : Does setting Java objects to null do anything anymore?

Explicit nulling makes little or no difference. The GC can reliably detect when an object can no longer be reached and can thus be collected.
In particular, nulling stack variables (i.e. local variables inside methods) helps absolutely nothing: it's trivial for the runtime to detect automatically when they are no longer needed. Nulling heap variables (i.e. fields of objects) can help in some rare instances, but that's a rare exception, and it usually does more harm (to code legibility and maintainability) than good.
Also note that nulling doesn't guarantee whether, or when, an object will be GC-ed.
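One of the rare cases where nulling a heap variable genuinely helps is a class that manages its own element storage, so the GC cannot tell which slots are logically dead. A hypothetical sketch (the class and names are invented for illustration):
public class SimpleStack {
    private Object[] elements = new Object[16];
    private int size = 0;

    public void push(Object e) {
        if (size == elements.length) {
            elements = java.util.Arrays.copyOf(elements, 2 * size);
        }
        elements[size++] = e;
    }

    public Object pop() {
        if (size == 0) {
            throw new java.util.EmptyStackException();
        }
        Object result = elements[--size];
        elements[size] = null; // drop the obsolete reference so it can be GC-ed
        return result;
    }
}
Without the nulling line in pop(), every popped object would stay strongly reachable through the internal array and could never be collected.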

Related

what to use instead of finalize() in java

Let's consider the following code:
class Table {
    private static int number_of_Tables = 0;

    public Table() {
        ++number_of_Tables;
    }

    public void finalize() {
        --number_of_Tables;
    }

    public static int current_TableCount() {
        return number_of_Tables;
    }
}
What I want to achieve is that when the Garbage Collector (GC) destroys an object, the count of existing objects gets decreased by one.
But everyone here discussing finalize() says that using this method is very bad, because the following could happen: even though there are no references pointing to the object, the GC may not destroy it immediately. The GC doesn't work around the clock; it is invoked once a certain amount of objects are there to be destroyed, i.e. it performs the cleanup at certain times. Which means that even though the object is not reachable anymore, my counter wouldn't decrease, and I would give false information upon invoking the method current_TableCount().
What do people do instead, to solve this kind of problem with certainty?
There must be some kind of solution in Java?
EDIT: I need to recognize when an object is not referenced anymore, i.e. when during runtime there no longer exists even one pointer (reference) to it. When this is true, I would then decrease the count of that kind of object by one.
…following could happen: even though there are no references pointing to the object the GC may not destroy it immediately because GC doesn't work around the clock
That’s correct. The garbage collector’s purpose is to manage memory and only to manage memory. As long as there are no memory needs, the garbage collector doesn’t need to run. It’s perfectly possible that an application runs completely without any gc cycle, when there is sufficient memory.
Further, there is no guarantee that a garbage collector run identifies all unreachable objects. It might stop its work once it has identified enough reclaimable memory to allow the application to proceed.
This, however, is not the only issue. Often overlooked, the fact that the garbage collector only cares for memory needs, implies that an object may get collected even when being in use, when its memory is not needed anymore, which is possible with optimized code. This is not a theoretical issue. See for example this bug or that bug related to naive dependency on finalization, even in JDK code.
Note that even if finalize() happens to get invoked at the right time, it’s invoked by an unspecified thread, which requires using thread safe constructs.
What do people do instead, to solve this kind of a problem with certainty?
People usually don’t have that kind of problem. If you truly manage a non-memory resource, you should use an explicit cleanup action, i.e. a method like dispose() or close(), to be invoked after use. The straightforward approach is to let the class implement AutoCloseable (or a subtype of it) and use the try-with-resources statement.
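As a hedged sketch of what that could look like for the Table example above (the counter is made thread safe with AtomicInteger, and the names are adapted for illustration):
import java.util.concurrent.atomic.AtomicInteger;

class Table implements AutoCloseable {
    private static final AtomicInteger numberOfTables = new AtomicInteger();

    public Table() {
        numberOfTables.incrementAndGet();
    }

    @Override
    public void close() {
        numberOfTables.decrementAndGet();
    }

    public static int currentTableCount() {
        return numberOfTables.get();
    }

    // usage inside some method: close() runs when the block is left
    static void demo() {
        try (Table t = new Table()) {
            // use the table
        }
    }
}
With this approach the count is exact as long as every Table is closed after use, instead of depending on when the garbage collector happens to run.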
Cleanup actions triggered by the garbage collector are only a last resort for dealing with scenarios where the explicit cleanup has been forgotten. As explained, implementing them needs special care.
In case of a counter maintained only for statistics, you may simply live with the fact that it is imprecise. Normally, you don’t need to know how many instances of a class exist. If you really need it, e.g. when trying to debug a memory leak, you can take a heap dump, a snapshot of all existing objects, and use a dedicated analysis tool.

Can I Force Garbage Collection in Java? [duplicate]

Possible Duplicate:
Forcing Garbage Collection in Java?
Can I force garbage collection in Java by any means?
System.gc() is just a suggestion. It's useless.
When I know for sure that some resources won't be used any more, why can't I force them to be cleaned up?
Just like delete in C++ and free() in C?
When there are lots of resources that can't be reused, this can really hurt performance. All we can do is sleep().
Any solutions? Thanks
Nope, System.gc() is as close as you can get. Java isn't C or C++; the JVM manages memory for you, so you don't have that kind of fine-grained control. If you set objects you're no longer using to null, or lose all references to them, they will get cleaned up. And the GC is pretty smart, so it should take good care of you.
That said, if you are on a Unix box and force a thread dump (kill -3), it'll pretty much force garbage collection.
You shouldn't be trying to force GC - if you are running low on memory then you have a memory leak somewhere. Forcing GC at that point won't help, because if you are holding a reference to the object then it still won't be garbage collected.
What you need to do is solve the real problem, and make sure you are not holding references to objects you are not using any more.
Some common culprits:
Holding lots of references in a large object graph that never get cleared up. Either set references to null when you don't need them any more, or better still simplify your object graph so it doesn't need all the extra long-term references.
Caching objects in a HashMap or something similar that grows huge over time. Stop doing this, or use something like Google's Guava CacheBuilder to create a proper soft-reference cache (see the sketch after this list).
Using String.intern() excessively on large numbers of different strings over time.
References with larger scope than they need. Are you using an instance variable when it could be a local variable, for example?
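A minimal sketch of the CacheBuilder idea from the list above, assuming Guava is on the classpath (the class name and key/value types are placeholders):
import com.google.common.cache.Cache;
import com.google.common.cache.CacheBuilder;

class ReportCache {
    // Bounded cache whose values are held through soft references, so the GC
    // can discard entries under memory pressure instead of the map growing forever.
    private final Cache<String, byte[]> cache = CacheBuilder.newBuilder()
            .maximumSize(10_000)  // hard cap on entry count
            .softValues()         // values may be reclaimed under memory pressure
            .build();

    void store(String key, byte[] report) {
        cache.put(key, report);
    }

    byte[] lookup(String key) {
        return cache.getIfPresent(key); // null if never stored, evicted, or collected
    }
}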
There is no way to explicitly instruct the JVM to collect garbage. This is only performed when the system needs the resources.
The only two actions I'm aware of to potentially get the GC running are the following:
As you stated, attempt to "suggest" that now would be a good time for GC by calling System.gc().
Set any references you are not using to the null reference to make the elements eligible for collection.
On my second point, see the answer here: Garbage collector in java - set an object null. In essence, if you don't make the objects you don't need available for garbage collection (by losing the reference you have to it) then there is no reason for the garbage collector to run, because it's unaware of any available garbage.
In addition, it's important to consider why/how those objects in memory are affecting performance:
Are you getting lots of OutOfMemoryErrors? This could be resolved by point #2 and by increasing the available heap space for the JVM.
Have you done measurements to see that more objects in the JVM's allocated heap space makes a difference in performance? Determining when you could let references to objects go earlier could help reduce these issues.

can any unused object escape from Garbage Collector?

Is there any possibility that an object which is not referenced anywhere is still existing on the heap? I mean, is there a possibility of an unused object escaping the garbage collector and staying on the heap until the end of the application?
I want to know because, if this can happen, I can be more cautious while coding.
If an object is no longer referenced, it does still exist on the heap, but it is also free to be garbage-collected (unless we are talking Class objects, which live in PermGen space and never get garbage-collected - but this is generally not something you need to worry about).
There is no guarantee on how soon that will be, but your application will not run out of memory before memory from those objects is reclaimed.
However, garbage collection does involve overhead, so if you are creating more objects than you need to and can easily create less, then by all means do so.
Edit: in response to your comment, if an object is truly not referenced by anything, it will be reclaimed during garbage collection (assuming you are using the latest JVM from Sun; I can't speak toward other implementations). The reason why is as follows: all objects are allocated contiguously on the heap. When GC is to happen, the JVM follows all references to "mark" objects that it knows are reachable - these objects are then moved into another, clean area. The old area is then considered to be free memory. Anything that cannot be found via a reference cannot be moved. The point is that the GC does not need to "find" the unreferenced objects. If anything, I would be more worried about objects that are still referenced when they are not intended to be, which will cause memory leaks.
You should know that, before a JVM throws an out-of-memory exception, it will have garbage collected everything possible.
If an instance is no longer referenced, it is a possible candidate for garbage collection. This means that sooner or later it can be removed, but there are no guarantees. If you do not run out of memory, the garbage collector might not even run; thus the instance may be there until the program ends.
The GC system is very good at finding unreferenced objects. There is a tiny, tiny chance that you end up keeping a weird mix of references where the garbage collector cannot decide for sure whether the object is still referenced or not. But this would be a bug in the GC system and nothing you should worry about while coding.
It depends on when and how often the object is used. If you allocate something then deallocate (i.e., remove all references to it) it immediately after, it will stay in "new" part of the heap and will probably be knocked out on the next garbage collection run.
If you allocate an object at the beginning of your program and keep it around for a while (if it survives through several garbage collections), it will get promoted to "old" status. Objects in that part of the heap are less likely to be collected later.
If you want to know all the nitty-gritty details, check out some of Sun's GC documentation.
Yes; imagine something like this:
Foo foo = new Foo();
// do some work here
while (true) { }  // loops forever (illustrative: javac would actually reject
                  // the next statement as unreachable)
foo.someOp();     // if this is the only reference to foo, it's theoretically
                  // impossible to reach here, so foo should be GC-ed, but all
                  // GC systems I know of will not GC it
I am using the definition: garbage = an object that can never be reached in any execution of the code.
Garbage collection intentionally makes few guarantees about WHEN the objects are collected. If memory never gets too tight, it's entirely possible that an unreferenced object won't be collected by the time the program ends.
The garbage collector will eventually reclaim all unreachable objects. Note the "eventually": this may take some time. You can somewhat force the issue with System.gc() but this is rarely a good idea (if used without discretion, then performance may decrease).
What can happen is that an object is "unused" (as in: the application will not use it anymore) while still being "reachable" (the GC can find a path of references from one of its roots -- static fields, local variables -- to the object). If you are not too messy with your objects and structures then you will not encounter such situations. A rule of thumb would be: if the application seems to take too much RAM, run a profiler on it; if thousands of instances of the same class have accumulated without any apparent reason, then there may be some fishy code somewhere. Correction often involves explicitly setting a field to null to avoid referencing an object for too long.
This is theoretically possible (there is no guarantee the GC will always find all objects), but should not worry you for any real application - it usually does not happen and certainly does not affect a significant chunk of memory.
In theory, the garbage collector will find all unused objects. There could, of course, be bugs in the garbage collector…
That said, "In theory there is no difference between theory and practice, in practice, there is." Under some, mostly older, garbage collectors, if an object definition manages to reach the permanent generation, then it will no longer be garbage collected under any circumstances. This only applied to Class definitions that were loaded, not to regular objects that were granted tenured status.
Correspondingly, if you have a static reference to an object, that takes up space in the "regular" object heap, this could conceivably cause problems, since you only need to hold a reference to the class definition from your class definition, and that static data cannot be garbage collected, even if you don't actually refer to any instances of the class itself.
In practice though, this is a very unlikely event, and you shouldn't need to worry about it. If you are super concerned about performance, then creating lots of "long-lived" objects, that is, objects that escape escape analysis and survive long enough to be tenured, will create extra work for the garbage collector. For 99.99% of coders this is a total non-issue though.
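A contrived sketch of the static-reference situation described above (the class and field are invented):
// Nothing ever needs an instance of Holder, yet once the class is loaded its
// static field pins roughly 10 MB on the regular heap for the lifetime of the
// class loader; the GC cannot reclaim it because the class itself references it.
class Holder {
    static final byte[] LOOKUP_TABLE = new byte[10 * 1024 * 1024];
}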
My advice - Don't worry about it.
Reason - It is possible for a non-referenced object to stay on the heap for some time, but it is very unlikely to adversely affect you because it is guaranteed to be reclaimed before you get an out of memory error.
In general, all objects to which there are no live hard references, will be garbage-collected. This is what you should assume and code for. However, the exact moment this happens is not predictable.
Just for completeness, two tricky situations [which you are unlikely to run into] come into my mind:
Bugs in JVM or garbage collector code
So called invisible references - they rarely matter but I did have to take them into account one or two times during the last 5 years in a performance-sensitive application I work on

How to cause soft references to be cleared in Java?

I have a cache which has soft references to the cached objects. I am trying to write a functional test for behavior of classes which use the cache specifically for what happens when the cached objects are cleared.
The problem is: I can't seem to reliably get the soft references to be cleared. Simply using up a bunch of memory doesn't do the trick: I get an OutOfMemoryError before any soft references are cleared.
Is there any way to get Java to more eagerly clear up the soft references?
Found here:
"It is guaranteed though that all
SoftReferences will get cleared before
OutOfMemoryError is thrown, so they
theoretically can't cause an OOME."
So does this mean that the above scenario MUST mean I have a memory leak somewhere with some class holding a hard reference on my cached object?
The problem is: I can't seem to reliably get the soft references to be cleared.
This is not unique to SoftReferences. Due to the nature of garbage collection in Java, there is no guarantee that anything that is garbage-collectable will actually be collected at any point in time. Even with a simple bit of code:
Object temp = new Object();
temp = null;
System.gc();
there is no guarantee that the Object instantiated in the first line is garbage collected at this, or in fact any point. It's simply one of the things you have to live with in a memory-managed language, you're giving up declarative power over these things. And yes, that can make it hard to definitively test for memory leaks at times.
That said, as per the Javadocs you quoted, SoftReferences should definitely be cleared before an OutOfMemoryError is thrown (in fact, that's the entire point of them and the only way they differ from the default object references). It would thus sound like there is some sort of memory leak in that you're holding onto harder references to the objects in question.
If you use the -XX:+HeapDumpOnOutOfMemoryError option to the JVM, and then load the heap dump into something like jhat, you should be able to see all the references to your objects and thus see if there are any references beside your soft ones. Alternatively you can achieve the same thing with a profiler while the test is running.
There is also the following JVM parameter for tuning how soft references are handled:
-XX:SoftRefLRUPolicyMSPerMB=<value>
Where 'value' is the number of milliseconds a soft reference will remain alive for every free MB of heap memory. The default is 1s per MB, so if an object is only softly reachable it will last 1 second when only 1 MB of heap space is free.
You can force all SoftReferences to be cleared in your tests with this piece of code.
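The linked code isn't reproduced here, but a common approach, sketched below for test use only (not necessarily the code referred to), is to allocate until an OutOfMemoryError is thrown and then swallow it; the JVM is required to clear all soft references before throwing the error:
import java.util.ArrayList;
import java.util.List;

final class SoftRefTestUtil {
    // Sketch for tests only: provoke an OutOfMemoryError so the JVM is forced
    // to clear every soft reference before the error is thrown, then swallow it.
    static void forceClearSoftReferences() {
        try {
            List<byte[]> hog = new ArrayList<>();
            while (true) {
                hog.add(new byte[1024 * 1024]); // grab memory in 1 MB chunks
            }
        } catch (OutOfMemoryError expected) {
            // soft references are now cleared; the hog list itself becomes
            // unreachable as soon as this method returns
        }
    }
}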
If you really wanted to, you can call clear() on your SoftReference to clear it.
That said, if the JVM is throwing an OutOfMemoryError and your SoftReference has not been cleared yet, then this means that you must have a hard reference to the object somewhere else. To do otherwise would invalidate the contract of SoftReference. Otherwise, you are never guaranteed that the SoftReference is cleared: as long as there is still memory available, the JVM does not need to clear any SoftReferences. On the other hand, it is allowed to clear them next time it does a GC cycle, even if it doesn't need to.
Also, you can consider looking into WeakReferences, since the VM tends to be more aggressive in clearing them. Technically, the VM isn't ever required to clear a WeakReference, but it is supposed to clean them up the next time it does a GC cycle if the object would otherwise be considered dead. If you are trying to test what happens when your cache is cleared, using WeakReferences should help your entries go away faster.
Also, remember that both of these are dependent on the JVM doing a GC cycle. Unfortunately, there is no way to guarantee that one of those ever happens. Even if you call System.gc(), the garbage collector may decide that it is doing just peachy and choose to do nothing.
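A sketch of how such a probe often looks in a test, with the caveat from above that System.gc() is only a hint, so the check can in principle be flaky (the method name is illustrative):
import java.lang.ref.WeakReference;

// Test-style probe (e.g. inside a JUnit test method): drop the only strong
// reference, hint the GC, and check whether the weak reference was cleared.
// Not guaranteed by the spec, but it works on common JVMs.
void cacheEntryBecomesCollectable() {
    Object entry = new Object();
    WeakReference<Object> ref = new WeakReference<>(entry);
    entry = null;      // no strong references remain
    System.gc();       // only a hint; the JVM may ignore it
    boolean cleared = (ref.get() == null);
    // assert cleared, or retry with a timeout if the first hint is ignored
}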
In a typical JVM implementation (Sun's), you need to trigger a full GC more than once to get the SoftReferences cleaned up. The reason is that SoftReferences require the GC to do more work, for example because of the mechanism that allows you to be notified when the objects are reclaimed.
IMHO, using a lot of SoftReferences in an application server is evil, because the developer has little control over when they are released.
Garbage collection and reference types like soft references are non-deterministic, so it's not really possible to reliably arrange for soft references to be cleared at a specific point so that your test can judge how your cache reacts. I would suggest you simulate the reference clearing in a more deterministic way, by mocking etc.; your tests will be reproducible and more valuable, rather than just hoping for the GC to clean up references. Relying on the GC here is a really bad idea and will just introduce additional problems rather than help you improve the quality of your cache and its collaborating components.
From the documentation and my experience I'd say yes: you must have a reference somewhere else.
I'd suggest using a debugger that can show you all references to an object (such as Eclipse 3.4 when debugging Java 6) and just check when the OOM is thrown.
If you use eclipse, there is this tool named Memory Analyzer that makes heap dump debugging easier.
Does the cached object have a finalizer? The finalization mechanism creates a new strong reference to the object, so even if the SoftReference is cleared, the memory will not be reclaimed until a later GC cycle.
If you have a cache which is a Map of SoftReferences and you want them cleared, you can just clear() the map and they will all be cleaned up (including their referents).

Why do you not explicitly call finalize() or start the garbage collector?

After reading this question, I was reminded of when I was taught Java and told never to call finalize() or run the garbage collector because "it's a big black box that you never need to worry about". Can someone boil the reasoning for this down to a few sentences? I'm sure I could read a technical report from Sun on this matter, but I think a nice, short, simple answer would satisfy my curiosity.
The short answer: Java garbage collection is a very finely tuned tool. System.gc() is a sledge-hammer.
Java's heap is divided into different generations, each of which is collected using a different strategy. If you attach a profiler to a healthy app, you'll see that it very rarely has to run the most expensive kinds of collections because most objects are caught by the faster copying collector in the young generation.
Calling System.gc() directly, while technically not guaranteed to do anything, in practice will trigger an expensive, stop-the-world full heap collection. This is almost always the wrong thing to do. You think you're saving resources, but you're actually wasting them for no good reason, forcing Java to recheck all your live objects “just in case”.
If you are having problems with GC pauses during critical moments, you're better off configuring the JVM to use the concurrent mark/sweep collector, which was designed specifically to minimise time spent paused, than trying to take a sledgehammer to the problem and just breaking it further.
The Sun document you were thinking of is here: Java SE 6 HotSpot™ Virtual Machine Garbage Collection Tuning
(Another thing you might not know: implementing a finalize() method on your object makes garbage collection slower. Firstly, it will take two GC runs to collect the object: one to run finalize() and the next to ensure that the object wasn't resurrected during finalization. Secondly, objects with finalize() methods have to be treated as special cases by the GC because they have to be collected individually, they can't just be thrown away in bulk.)
Don't bother with finalizers.
Switch to incremental garbage collection.
If you want to help the garbage collector, null out references to objects you no longer need. Fewer paths to follow = more obviously garbage.
Don't forget that (non-static) inner class instances keep references to their parent class instance. So an inner class thread keeps a lot more baggage than you might expect.
In a very related vein, if you're using serialization, and you've serialized temporary objects, you're going to need to clear the serialization caches, by calling ObjectOutputStream.reset() or your process will leak memory and eventually die.
Downside is that non-transient objects are going to get re-serialized.
Serializing temporary result objects can be a bit more messy than you might think!
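A hedged sketch of the reset() pattern described above (the file name, method and element type are placeholders):
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.ObjectOutputStream;
import java.util.List;

// Writing many temporary objects: reset() clears the stream's back-reference
// table so it stops pinning every object written so far; the trade-off is that
// shared (non-transient) objects get re-serialized in full afterwards.
static void writeResults(List<? extends java.io.Serializable> results) throws IOException {
    try (ObjectOutputStream out = new ObjectOutputStream(new FileOutputStream("results.bin"))) {
        for (Object r : results) {
            out.writeObject(r);
            out.reset(); // drop the handle table to avoid the memory leak
        }
    }
}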
Consider using soft references. If you don't know what soft references are, have a read of the javadoc for java.lang.ref.SoftReference
Steer clear of Phantom references and Weak references unless you really get excitable.
Finally, if you really can't tolerate the GC use Realtime Java.
No, I'm not joking.
The reference implementation is free to download, and Peter Dibble's book from Sun is really good reading.
As far as finalizers go:
They are virtually useless. They aren't guaranteed to be called in a timely fashion, or indeed, at all (if the GC never runs, neither will any finalizers). This means you generally shouldn't rely on them.
Finalizers are not guaranteed to be idempotent. The garbage collector takes great care to guarantee that it will never call finalize() more than once on the same object. With well-written objects, it won't matter, but with poorly written objects, calling finalize multiple times can cause problems (e.g. double release of a native resource ... crash).
Every object that has a finalize() method should also provide a close() (or similar) method. This is the function you should be calling. e.g., FileInputStream.close(). There's no reason to be calling finalize() when you have a more appropriate method that is intended to be called by you.
Assuming finalizers are similar to their .NET namesake then you only really need to call these when you have resources such as file handles that can leak. Most of the time your objects don't have these references so they don't need to be called.
It's bad to try to collect the garbage because it's not really your garbage. You told the VM to allocate some memory when you created objects, and the garbage collector is hiding information about those objects. Internally, the GC is performing optimisations on the memory allocations it makes. When you manually try to collect the garbage you have no knowledge of what the GC wants to hold onto and what it wants to get rid of; you are just forcing its hand. As a result you mess up its internal calculations.
If you knew more about what the GC was holding internally then you might be able to make more informed decisions, but then you've missed the benefits of GC.
The real problem with closing OS handles in finalize() is that finalizers are executed in no guaranteed order. But if you have handles to things that block (think e.g. sockets), your code can potentially get into a deadlock situation (not trivial at all).
So I'm for explicitly closing handles in a predictable orderly manner. Basically code for dealing with resources should follow the pattern:
SomeStream s = null;
// ...
try {
    s = openStream();
    // ...
    s.io();
    // ...
} finally {
    if (s != null) {
        s.close();
        s = null;
    }
}
It gets even more complicated if you write your own classes that work via JNI and open handles. You need to make sure handles are closed (released), and that it happens only once. A frequently overlooked OS handle in desktop J2SE is Graphics[2D]. Even BufferedImage.getGraphics() can potentially return a handle that points into a video driver (actually holding the resource on the GPU). If you don't release it yourself and leave it to the garbage collector to do the work, you may find strange OutOfMemory and similar situations when you have run out of video-card-mapped bitmaps but still have plenty of memory. In my experience it happens rather frequently in tight loops working with graphics objects (extracting thumbnails, scaling, sharpening, you name it).
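For the Graphics case, a minimal sketch of releasing the handle deterministically instead of leaving it to the GC (the image size and drawing call are placeholders):
import java.awt.Graphics2D;
import java.awt.image.BufferedImage;

// Release the graphics handle explicitly instead of waiting for finalization.
BufferedImage image = new BufferedImage(200, 100, BufferedImage.TYPE_INT_RGB);
Graphics2D g = image.createGraphics();
try {
    g.drawString("thumbnail", 10, 20); // drawing work goes here
} finally {
    g.dispose(); // frees the native/driver resources behind the handle
}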
Basically, the GC does not take care of the programmer's responsibility for correct resource management. It only takes care of memory and nothing else. A Stream.finalize() that calls close() would IMHO be better implemented as throwing a new RuntimeException("garbage collecting a stream that is still open"). It would save hours and days of debugging and cleaning up code after sloppy amateurs left loose ends.
Happy coding.
Peace.
The GC does a lot of optimization around when to properly finalize things.
So unless you're familiar with how the GC actually works and how it tags generations, manually calling finalize() or starting a GC will probably hurt performance more than help.
Avoid finalizers. There is no guarantee that they will be called in a timely fashion. It could take quite a long time before the Memory Management system (i.e., the garbage collector) decides to collect an object with a finalizer.
Many people use finalizers to do things like close socket connections or delete temporary files. By doing so you make your application behaviour unpredictable and tied to when the JVM is going to GC your object. This can lead to "out of memory" scenarios, not due to the Java Heap being exhausted, but rather due to the system running out of handles for a particular resource.
One other thing to keep in mind is that calling System.gc() or similar hammers may show good results in your environment, but they won't necessarily translate to other systems. Not everyone runs the same JVM; there are many: Sun, IBM J9, BEA JRockit, Harmony, OpenJDK, etc. These JVMs all conform to the JCK (those that have been officially tested, that is), but have a lot of freedom when it comes to making things fast. GC is one of those areas that everyone invests in heavily. Using a hammer will often destroy that effort.
