Java Swing memory usage

There are plenty of questions on this issue, but no answer has satisfied me. I'm coding a simple GUI window using Swing. Currently it only contains 4 buttons, one of which opens a file chooser.
When I use VisualVM Monitoring tool to look at memory usage, the result is shown below:
(memory usage graph: http://img17.imageshack.us/img17/3589/8txc.png)
The first spike appeared when I clicked the button; after that I did nothing else.
Is it normal for an idle application to consume 10 MB/min when it is not doing anything?
Since I have to build a quiz-like applet with an image, should I call System.gc() each time I switch to the next question to avoid (potentially huge?) memory usage?

I could recreate what you saw with a simple test program, but if you let the monitor track a little longer...
CPU was at 0% the whole time (just an open JFileChooser sitting there). The memory fluctuations are curious, but the JVM must be doing all kinds of background management and maintenance tasks. That work uses memory, which gradually builds up, and periodically the JVM cleans up after itself.
Another thing to consider is that this memory usage is within the already-allocated heap. Your program actually uses a consistent amount of real memory the entire time, as you can see from the flat orange line.

I think the problem is the JFileChooser. When I used it, I noticed that the file chooser was "waiting" for something, and when I closed the program incorrectly, the file chooser threw an Error. Maybe this will help you :)

Related

Memory build up in a java program GapContent$MarkData

I am currently developing an application that needs to execute very fast, about every 20 ms (yeah I know, I should not have taken Java in the first place). I worked a lot on optimizing the code so it would not be too computation-greedy. However, as I have seen, I may not have put enough effort into GUI and memory optimization. My application can run at the speed I want, but after 1-2 minutes it drastically slows down, suggesting a memory problem.
I ran the profiler under NetBeans and found that most of the memory was taken by javax.swing.text.GapContent$MarkData.
I searched on Google and found almost nothing understandable that helped with the problem. So is there anyone who could help me? My first guess would be that the garbage collector doesn't run long enough to erase unused objects... but I don't have more of a clue than that.
You are right to employ profiling; now use Profile > Profile Project > CPU to find and target the hot spot(s).
The slowdown was due to a function that closed and opened a connection with the database each iteration.
Consider using SwingWorker to query the database in the background and process() results on the event dispatch thread, as shown in this related example and sketched below.
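A minimal sketch of that pattern, assuming a JDBC source and a Swing text area named textArea (the JDBC URL, query, and component name here are placeholders, not from the original post):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;
import java.util.List;
import javax.swing.SwingWorker;

SwingWorker<Void, String> worker = new SwingWorker<Void, String>() {
    @Override
    protected Void doInBackground() throws Exception {
        // Open one connection for the whole task instead of one per iteration.
        try (Connection con = DriverManager.getConnection("jdbc:h2:mem:demo");
             Statement st = con.createStatement();
             ResultSet rs = st.executeQuery("SELECT name FROM questions")) {
            while (rs.next()) {
                publish(rs.getString(1)); // hand rows to process()
            }
        }
        return null;
    }

    @Override
    protected void process(List<String> rows) {
        // Runs on the event dispatch thread; safe to touch Swing here.
        for (String row : rows) {
            textArea.append(row + "\n");
        }
    }
};
worker.execute();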
What you are calling a "memory build up" is only 600 KB. If this 600 KB is problematic, I question your choice of Java and Swing.
I have an application that sometimes generates hundreds of megabytes of log messages.
I'm guessing your GUI application is somewhat similar. The app probably has a JTextPane that displays a log, and as the app runs it adds messages to the JTextPane.
JTextArea uses a PlainDocument and JTextPane a DefaultStyledDocument; both store their text in a GapContent buffer.
Even though you probably always insert new log messages at only the top or only the bottom, the implementation is general-purpose. It supports modification anywhere in the document by keeping a gap in the underlying stream of text and applying changes into the gap, tracking positions with mark records. As the app inserts new messages into the Document, it creates lots and lots of these marks.
The actual text to display has to exist somewhere. There is probably a better way to implement a huge text pane, but the default JTextPane will look, to the profiler, like a memory leak. If you have 600 KB of log messages, it is going to take at least 600 KB of memory somewhere.
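One common workaround, if the log only needs to show recent output, is to cap the Document size as you append. A rough sketch (the method name and cap are my own, not from the answer):

import javax.swing.JTextPane;
import javax.swing.text.BadLocationException;
import javax.swing.text.Document;

// Append a log line, then trim from the front so the Document (and its
// gap/mark bookkeeping) cannot grow without bound.
static void appendCapped(JTextPane pane, String line, int maxChars) {
    try {
        Document doc = pane.getDocument();
        doc.insertString(doc.getLength(), line + "\n", null);
        int overflow = doc.getLength() - maxChars;
        if (overflow > 0) {
            doc.remove(0, overflow); // drop the oldest text
        }
    } catch (BadLocationException e) {
        throw new AssertionError(e); // the offsets above are always valid
    }
}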
You should know that the Java Console uses a PlainDocument with GapContent$MarkData, and just having the console open with lots of data in it will cause this "memory leak" to appear. Clear the console to see the number of MarkData objects drop back to acceptable levels.

restart java process on heap dump

I need to restart a Java process if it produces any memory issues like 'GC overhead limit exceeded' or 'Java heap space'.
Is there some standard way of doing this, like using some tool or option?
If not, how can I put up a watchdog to do it?
I noticed that my process does not go down when these issues happen, and a restart brings it back on its feet again.
There are people here who will suggest better options, so this is just my $0.02. What I did a while ago on some app is keep a SoftReference to an Object and check once in a while whether that Object is null. SoftReferences are collected (usually, but not guaranteed) by the GC right before you get really close to OutOfMemory, so that would somehow tell you that you are really close to failing.
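A minimal sketch of that canary idea (the class and method names are mine, not from the original answer):

import java.lang.ref.SoftReference;

public class MemoryCanary {
    // The GC clears softly reachable objects before throwing OutOfMemoryError,
    // so a cleared canary means memory is getting tight.
    private SoftReference<Object> canary = new SoftReference<>(new Object());

    public boolean isMemoryLow() {
        if (canary.get() == null) {
            canary = new SoftReference<>(new Object()); // re-arm for the next check
            return true;
        }
        return false;
    }
}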
Also, in this case you should be looking at the JVM option:
-XX:SoftRefLRUPolicyMSPerMB=someValue
where 'someValue' is the number of milliseconds a soft reference will remain alive for every free MB of memory. The default is 1 s/MB, so an object that is only softly reachable will last one second if only 1 MB of heap space is free.
It is probably not the best option, but just a hint, maybe?
Cheers, Eugene.
Runtime#freeMemory() will tell you how much memory is available within the VM - you could monitor that and raise an alarm when it reaches a threshold. Calling System.gc() at that point may free some more memory, but it isn't guaranteed and should be seen as a last resort.
You really need to combine this with understanding why you are running out of memory and trying to do something to fix it.
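A rough sketch of that threshold check (the 32 MB threshold is an arbitrary example); note that freeMemory() alone understates headroom, because the heap can still grow up to maxMemory():

Runtime rt = Runtime.getRuntime();
long used = rt.totalMemory() - rt.freeMemory();
long headroom = rt.maxMemory() - used; // memory the heap can still grow into
if (headroom < 32L * 1024 * 1024) {
    System.gc(); // a request only; treat as a last resort
    // raise an alarm, log, or trigger a heap dump here
}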
You could use the Tanuki Software Java Service Wrapper; it will handle Automatic customizable response when something happens in your application or JVM.
It has a filter feature, described as follows:
Filters are a very powerful feature which makes it possible to add new behavior to existing applications without any coding. It works by monitoring the console output of a JVM for sequences of text. When they are found, any number of actions can then be taken.
Examples include initiating a JVM restart whenever a specific error occurs. Some applications have known bugs where they stop working once they get into a certain state. This feature makes it possible to work around such problems immediately until they can be resolved in the application.
Assuming your Java application returns 0 upon graceful shutdown, the shell script below can serve as a watchdog.
#!/bin/bash
...
# Relaunch the JVM until it exits cleanly (exit code 0).
while true; do
    java ... MyClass && break
done

High CPU usage of a Java process under Linux

I am running into trouble determining what is wrong with my software.
The situation is:
- The program always runs in the background and performs some actions every X minutes.
- Right now it is set to check a certain directory every minute and see if there are new files in it.
- If there are new files, they are processed and moved somewhere else.
- If not, it simply logs the event and goes idle again.
I assume that when new files appear, CPU usage can be somewhat high.
The problem is that even if I don't put new files in the directory for many days, the CPU usage rises to ~90% every minute it checks for new entries, then after some seconds returns to <1% usage.
The same process under Windows seems stable, always staying at low CPU usage.
If I monitor the CPU activity over a month, I can see that the average CPU usage of my Java process keeps growing (without new files to 'activate' the rest of the process), and I have to restart the process for it to return to lower CPU usage levels.
I really don't understand this behaviour, so I don't know what may be affecting it.
If the log file is somewhat 'big', like 10-20 MB, would it require that much CPU to log a new entry every minute?
If there are many libraries loaded in the classpath for this process, will the CPU usage be increased even though most of these libraries won't be used most of the time?
Excuse me if I haven't been very clear in my question; I am somewhat new to this.
Thanks everyone in advance, regards.
--edit--
I note your advice; I will do some monitoring and post some code/results to share with you and see what you can come up with!
I am really lost right now!
If your custom monitoring code is causing the problem, you could always use something standard like Apache Commons IO's FileAlterationMonitor. It's simple to implement and it might be faster than fixing your current code.
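A minimal sketch of that approach, assuming the commons-io dependency is on the classpath (the directory name and the processing body are placeholders):

import java.io.File;
import org.apache.commons.io.monitor.FileAlterationListenerAdaptor;
import org.apache.commons.io.monitor.FileAlterationMonitor;
import org.apache.commons.io.monitor.FileAlterationObserver;

void watchDirectory() throws Exception {
    FileAlterationObserver observer = new FileAlterationObserver(new File("incoming"));
    observer.addListener(new FileAlterationListenerAdaptor() {
        @Override
        public void onFileCreate(File file) {
            // process and move the new file here
        }
    });
    // Poll once a minute instead of scanning in a tight loop.
    FileAlterationMonitor monitor = new FileAlterationMonitor(60_000, observer);
    monitor.start();
}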
Are you talking about a simple console application or a Swing/AWT app?
Is the application run every minute by the OS scheduler (cron, at), or is it a single long-running server process?
If it runs as a server, how do you launch the VM? (server VM or client VM; the -server switch on the command line)
You may also check your garbage collector; sometimes logging frameworks use up too many objects without releasing their references.
Regards,
M.

Java application memory usage

I have been writing a small Java application (my first!) that only does a few things at the moment. Currently, it runs the Main class, which launches a GUI class (a class I wrote that extends JFrame and only contains a JTextArea), a class that loads a local file of approximately 40 KB through a BufferedInputStream, and a class that loads an entry from a Java properties file.
Everything works wonderfully; however, I was watching the Windows Task Manager and noticed something that struck me as odd. When I launch the application, the RAM usage jumps to about 40 MB while it loads the local file and pulls a few values from it to display in the JTextArea, which seems normal to me because of the JVM, Java base classes, etc. At this point, however, when the application has finished loading the file, it merely sits idle, as I currently don't have it doing anything else. While it is sitting idle, as long as the window is active, the application's memory usage climbs by 10-20 KB every second. This strikes me as odd. If I click on another program to make this one the inactive window, the memory still rises, but at a much slower rate (about 10 KB every 3-5 seconds).
I have not tested to see how far it would go up, but this strikes me as very odd behavior. Is this normal Java behavior? I guess it is possible that my code could be leaking memory, but I'm not sure how. I did make sure to close the BufferedInputStream I am using, and I can't see what else would cause this.
I'm sorry if my explanation doesn't make sense, but I would appreciate any insight and/or pointers anyone may have.
UPDATE:
Upon suggestion, I basically stripped my application down to the Main class, which simply calls the GUI class. The GUI class only extends JFrame and sets the window size, close operation, and visible properties. With these changes, the memory still grows by 10-20 KB, but at a slower rate. This, in conjunction with other advice I have received, leads me to believe that this is just Java. I will continue to play with it and let you all know if I find out anything else interesting.
Try monitoring the heap usage with jconsole instead of the Windows task manager:
Launch your app with the -Dcom.sun.management.jmxremote option e.g.
java -Dcom.sun.management.jmxremote -jar myapp.jar
Launch jconsole from the command line, and connect to the local pid of the java process you started in the last step.
Click over to memory and watch heap memory (the default display)
If you watch for a while, you'll probably get a "sawtooth" pattern as the memory climbs over time, but then has sharp drop-offs when the garbage collector runs. You can try to "suggest" garbage collection by clicking the so-labelled button.
When you do this, does the memory usage drop down to the same minimum level, or is the overall minimum increasing over the course of several minutes? If the minimum usage increases, then you have a memory leak. If it always returns to the same minimum level, then you're fine.
Congrats on your first app! Now, a couple of things to think about. First, the Windows Task Manager is not a great resource for understanding how quickly your VM is growing. Instead, you should monitor your garbage collection stats in the console (use the -verbose:gc command-line param). Second, if you are concerned about potential leaks and the growth of the VM, there are a bunch of great profilers out there that are easy to use and can help you diagnose memory issues. Check out these two posts for some profiler options.
Congratulations on your first Java app!
Java applications run in a virtual machine. The JVM has a maximum heap size, configurable with the -Xmx option (the default depends on the JVM version and machine), and the garbage collector generally won't work hard until the application approaches that limit. Try lowering the limit, for example to 32 MB.
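For example, assuming your app is packaged as myapp.jar (a placeholder name):

java -Xmx32m -jar myapp.jar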
Is this normal Java behavior?
No.
I guess it is possible that my code could be leaking memory
That is definitely the cause. Please post your source code, otherwise further diagnosis isn't possible.
I noticed you are using Swing; make sure you are launching your JFrame on the event dispatch thread, using the invokeLater(Runnable) method (see the sketch after this list).
If you are using any sort of collections, make sure you clear them once done.
Since you are doing some file IO, make sure you close all of the classes involved in the IO operations after you are done with them.
If you are using any event listeners, remember to explicitly remove the listeners when they are no longer necessary.
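For the first point, a minimal sketch of launching the frame on the EDT (the frame title and size are placeholders matching your description):

import javax.swing.JFrame;
import javax.swing.SwingUtilities;

public static void main(String[] args) {
    // Build all Swing components on the event dispatch thread.
    SwingUtilities.invokeLater(new Runnable() {
        @Override
        public void run() {
            JFrame frame = new JFrame("My App");
            frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
            frame.setSize(400, 300);
            frame.setVisible(true);
        }
    });
}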
One thing you could try is experimenting. Take your application and remove the file IO; see what happens. Does the memory usage still climb as before? Now restore your application to normal and remove the text area - does the memory still climb as before? And so on. This will help you determine what the source is, and you can focus your efforts there. Most likely you will uncover what you are after by doing this.
Another useful diagnostic technique is to call System.gc() at particular points in time, usually after heavy-lifting blocks of code. This tells the JVM to perform a garbage collection at that point in the execution, rather than at a time determined by memory consumption, which helps you account for periodic fluctuations in the memory usage of your application.
Failing that, you can always use a memory profiler. If you are using the NetBeans IDE, there's one built right into it; for Eclipse, there are several plugins which can perform profiling.
It is normal. Some background work might leave dead objects around, which the JVM isn't in a hurry to clean up; eventually they will be garbage-collected as max memory is approached.
Leave your program running overnight, and your machine won't blow up.

How to detect Out Of Memory condition?

I have an application running on WebSphere Application Server 6.0, and it crashes nearly every day with an Out-Of-Memory error. From verbose GC it is certain there are memory leaks (many of them).
Unfortunately the application is provided by an external vendor, and getting things fixed is a slow and painful process. As part of the process I need to gather the logs and heap dumps each time the OOM occurs.
Now I'm looking for some way to automate it. The fundamental problem is how to detect the OOM condition. One way would be to create a shell script which periodically searches for new heap dumps. This approach seems kind of dirty to me. Another approach might be to leverage JMX somehow, but I have little or no experience in this area and don't have much idea how to do it.
Or is there in WAS some kind of trigger/hook for this? Thank you very much for any advice!
You can pass the following arguments to the JVM on startup, and a heap dump will be automatically generated on an OutOfMemoryError. The second argument lets you specify the path for the heap dump file. By using this, you could at least check for the existence of a specific file to see if a heap dump has occurred.
-XX:+HeapDumpOnOutOfMemoryError
-XX:HeapDumpPath=<value>
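For example (the jar name and dump directory here are placeholders):

java -XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=/tmp/dumps -jar app.jar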
I see two options if you want heap dumping automated but @Mark's solution with a heap dump on OOM isn't satisfactory.
You can use the MemoryMXBean to detect high memory pressure and then programmatically create a heap dump if the usage (or usage delta) looks high; a sketch follows below.
You can periodically collect memory usage info and generate heap dumps with a cron'd shell script using jmap (works both locally and remotely).
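A rough sketch of the first option, using the standard java.lang.management API (the threshold fraction and the println are illustrative only):

import java.lang.management.ManagementFactory;
import java.lang.management.MemoryNotificationInfo;
import java.lang.management.MemoryPoolMXBean;
import javax.management.Notification;
import javax.management.NotificationEmitter;

public class MemoryPressureWatcher {
    public static void install(double thresholdFraction) {
        // Arm a usage threshold on each pool that supports one, so the JVM
        // emits a JMX notification when usage crosses it.
        for (MemoryPoolMXBean pool : ManagementFactory.getMemoryPoolMXBeans()) {
            long max = pool.getUsage().getMax();
            if (pool.isUsageThresholdSupported() && max > 0) {
                pool.setUsageThreshold((long) (max * thresholdFraction));
            }
        }
        // The MemoryMXBean is also a NotificationEmitter.
        NotificationEmitter emitter =
                (NotificationEmitter) ManagementFactory.getMemoryMXBean();
        emitter.addNotificationListener((Notification n, Object handback) -> {
            if (MemoryNotificationInfo.MEMORY_THRESHOLD_EXCEEDED.equals(n.getType())) {
                System.err.println("High memory pressure: " + n.getMessage());
                // trigger a heap dump (e.g. via jmap) here
            }
        }, null, null);
    }
}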
It would be nice if you could have a callback on OOM, but, uhm, that callback probably would just crash with an OOM error. :)
Have you looked at JConsole? It uses JMX to give you visibility into a variety of JVM metrics, including memory info. It would probably be worth monitoring your application with it to begin with, to get a feel for how and when the memory is consumed. You may find the memory is consumed uniformly over the day, or only when using certain features.
Take a look at the detecting low memory section of the above link.
If you need you can then write a JMX client to watch the application automatically and trigger whatever actions required. JConsole will indicate which JMX methods you need to poll.
An alternative to waiting until the application has crashed may be to script a controlled restart, say every night, if you're optimistic that it can survive for twelve hours.
Maybe WebSphere can even do that for you!?
You could add a listener (session-scoped or application-scoped attribute listener) class that is called each time a new object is added to session/application scope.
In it you can check the total memory used by the app (and log it) and also request a GC run (note that invoking System.gc() does not guarantee that a collection will actually run).
(The above covers the logging part and GC based on usage growth.)
For scheduled GC:
In addition, you can keep a timer task class that runs every few hours and requests a GC.
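A minimal sketch of such a timer task (the interval and logging are arbitrary choices):

import java.util.Timer;
import java.util.TimerTask;

Timer timer = new Timer("gc-timer", true); // daemon thread, won't block shutdown
timer.scheduleAtFixedRate(new TimerTask() {
    @Override
    public void run() {
        Runtime rt = Runtime.getRuntime();
        long usedMb = (rt.totalMemory() - rt.freeMemory()) / (1024 * 1024);
        System.out.println("Heap in use: " + usedMb + " MB");
        System.gc(); // a request only; the JVM may ignore it
    }
}, 0, 4 * 60 * 60 * 1000L); // run now, then every four hours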
Our experience with ITCAM has been less than stellar from the monitoring perspective. We dumped it in favor of CA Wily Introscope.
Have you had a look at the jvisualvm tool in the latest Java 6 JDKs?
It is great for inspecting running code.
I'd dispute that you need the heap dumps when the OOM occurs. Periodic gathering of the information over time should give the picture of what's going on.
As has been observed, various tools exist for analysing these problems. I have had success with ITCAM for WebSphere; as an IBMer I have ready access to it. We were very quickly able to identify the exact lines of code in our problem situation.
If there's any way you can get a tool of that nature then that's the way to go.
It should be possible to write a simple program that gets the process list from the kernel and scans it to see if your WAS process is still running. On a Unix box you could probably whip something up in Perl in a few minutes (if you know Perl); I'm not sure how difficult it would be under Windows. Run it as a scheduled task every five minutes or so, and if the process doesn't show up, you could have it fork off another process to deal with the heap dump and restart WAS.
