Memory build up in a java program GapContent$MarkData - java

I am currently developing an application that needs to execute very fast, about every 20 ms (yeah I know, I should not have taken Java in the first place). I worked a lot on optimizing the code so it would not be too computationally greedy. However, as I have seen, I may not have put enough effort into GUI and memory optimization. My application can run at the speed I want, but after 1-2 minutes it slows down drastically, which suggests a memory problem.
I ran the profiler under NetBeans and found out that most of the memory was taken by javax.swing.text.GapContent$MarkData.
I searched on Google and found almost nothing understandable that would help me with this problem. So is there anyone who could help me? My first guess would be that the garbage collector doesn't run long enough to erase unused objects... but I don't have more clues than that.

You are right to employ profiling; now use Profile > Profile Project > CPU to find and target the hot spot(s).
The slowdown was due to a function that closed and opened a connection to the database on each iteration.
Consider using SwingWorker to query the database in the background and process() results on the event dispatch thread, as shown in this related example.
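In the same spirit, here is a minimal sketch of that pattern; the fetchRow() helper, the 20 ms delay, and the JTextArea target are placeholder assumptions, not code from the question or the linked example:

import java.util.List;
import javax.swing.JTextArea;
import javax.swing.SwingWorker;

// Queries run on a worker thread; process() touches Swing only on the EDT.
class DatabasePoller extends SwingWorker<Void, String> {

    private final JTextArea logArea;

    DatabasePoller(JTextArea logArea) {
        this.logArea = logArea;
    }

    @Override
    protected Void doInBackground() throws Exception {
        // Reuse one connection here instead of opening and closing it every cycle.
        while (!isCancelled()) {
            publish(fetchRow());   // hand each result to process()
            Thread.sleep(20);      // the ~20 ms cycle from the question
        }
        return null;
    }

    @Override
    protected void process(List<String> rows) {
        for (String row : rows) {
            logArea.append(row + "\n"); // safe: runs on the event dispatch thread
        }
    }

    private String fetchRow() {
        return "..."; // placeholder for the real database query
    }
}

Calling new DatabasePoller(logArea).execute() starts the loop without blocking the GUI.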

What you are calling a "memory build-up" is only 600 KB. If this 600 KB is problematic, I question your choice of Java and Swing.
I have an application that sometimes generates hundreds of megabytes of log messages.
I'm guessing your GUI application is somewhat similar. The app probably has a JTextPane that displays a log. As the app runs it adds messages to the JTextPane.
The default Document used by JTextPane is backed by GapContent, the same gap-based storage that PlainDocument uses.
Even though you probably always insert new log messages only at the top or only at the bottom, that implementation is general-purpose. It supports modification anywhere in the document by keeping a gap in the underlying stream of text and writing the changes into the gap, and it tracks positions within that buffer using MarkData objects. As the app inserts new messages into the Document, those marks pile up.
The actual text to display has to exist somewhere. There is probably a better way to implement a huge text pane, but the default JTextPane will look, to the profiler, like a memory leak. If you have 600 KB of log messages, it's going to take at least 600 KB of memory somewhere.
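If the goal is simply to keep the log pane from growing without bound, one option (a sketch, not part of the original answer; the 1000-line cap is an arbitrary example) is to trim the oldest lines from the Document as new messages are appended:

import javax.swing.text.BadLocationException;
import javax.swing.text.Document;
import javax.swing.text.Element;

final class CappedLog {
    // Append a message to the end of the document and drop the oldest lines
    // once maxLines is exceeded. Works for the gap-backed documents used by
    // JTextArea/JTextPane; call it on the event dispatch thread.
    static void append(Document doc, String message, int maxLines) throws BadLocationException {
        doc.insertString(doc.getLength(), message + "\n", null);
        Element root = doc.getDefaultRootElement();
        int excess = root.getElementCount() - maxLines;
        if (excess > 0) {
            doc.remove(0, root.getElement(excess - 1).getEndOffset());
        }
    }
}

For example, CappedLog.append(textPane.getDocument(), line, 1000) keeps only the last 1000 lines, which also keeps the amount of retained text bounded.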

You should know that the Java Console uses a PlainDocument backed by GapContent$MarkData, and just having the console open with lots of data in it will cause this "memory leak" to appear. Clear the console and the number of MarkData instances drops back to acceptable levels.

Related

Java Swing memory usage

There are plenty of questions on this issue, but no answer satisfied me. I'm coding a simple GUI window using Swing. Currently it only contains 4 buttons, and one of them opens a file chooser.
When I use VisualVM Monitoring tool to look at memory usage, the result is shown below:
Memory usage screenshot: http://img17.imageshack.us/img17/3589/8txc.png
The first spike appeared when I clicked the button; after that I did nothing else.
Is it normal for an idle application to consume 10 MB/min when it is not doing anything?
Since I have to build a quiz-like applet with an image, should I call System.gc() each time I switch to the next question to avoid (potentially huge?) memory usage?
I could recreate what you saw with a simple test program, but if you let the monitor track a little longer...
CPU was at 0% the whole time (just an open JFileChooser sitting there). The memory fluctuations are curious, but I think the JVM must be doing all kinds of background management and maintenance tasks. That work uses memory, which gradually builds up, and periodically the JVM cleans up after itself.
Another thing to consider is that this memory usage is within the already allocated heap. Your program is actually using a consistent amount of real memory the entire time, as you can see from the flat orange line.
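If you want to see used heap versus the heap the JVM has already reserved for itself, here is a quick sketch (not from the original answer) using Runtime:

public class HeapStats {
    public static void main(String[] args) {
        // Used vs. currently allocated vs. maximum heap, in kilobytes.
        Runtime rt = Runtime.getRuntime();
        long used = rt.totalMemory() - rt.freeMemory();
        System.out.printf("used: %d KB, allocated: %d KB, max: %d KB%n",
                used / 1024, rt.totalMemory() / 1024, rt.maxMemory() / 1024);
    }
}

Printing this periodically (say, from a javax.swing.Timer) shows the "used" figure rising and falling inside a flat "allocated" figure.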
I think the problem is the JFileChooser. When I used it, I noticed that the file chooser was "waiting" for something, and when I closed the program incorrectly, the file chooser threw an error. Maybe this will help you :)

java first encounter with heap space error server data logger

I built my first Java program on top of the Interactive Brokers Java API. That may or may not be important; I just extended the main API classes with a couple of new classes.
The program makes data queries to a remote server. When the server responds, I log the received data to a local MySQL database. Once the program finishes logging the data, it makes the next data request.
I have a problem after leaving the program running for some time and making a couple hundred server requests: I see this error, and then the program stops executing:
java.lang.OutOfMemoryError: Java heap space
I did some research, and from what I read I conclude that the program is creating many new variables and not destroying old, worthless ones. Since I am using NetBeans for development, I used the NetBeans profiler to check whether this was the case. See the picture here:
After running the program for quite some time, more and more of the memory is used up by Byte. So it seems that my theory is still true.
I don't really know where to go from here. There is no reference to a class or specific variable, just a variable type. How can I pinpoint where the problem is coming from?
UPDATE
I corrected a specific problem that was mentioned by BigMike in the comments. Previously, I was creating many Statements with the JDBC MySQL connector and calling .execute() to run them, but I wasn't closing them with .close().
I made sure to add the statement.close() call after each execution, and the program runs much better now. Judging by the RAM usage, that seems to have solved the problem. I am also no longer seeing the Java heap space error, which is nice.
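For reference, a minimal sketch of that fix using try-with-resources, which closes the Statement automatically; the class shape and SQL handling are placeholders, not taken from the question:

import java.sql.Connection;
import java.sql.SQLException;
import java.sql.Statement;

public class LogWriter {
    private final Connection con; // reused across requests

    LogWriter(Connection con) {
        this.con = con;
    }

    void log(String sql) throws SQLException {
        // try-with-resources closes the Statement even if execute() throws,
        // so Statement objects no longer pile up on the heap.
        try (Statement st = con.createStatement()) {
            st.execute(sql);
        }
    }
}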
Thanks!
It's very hard to say what might be wrong from that alone.
It might have to do with streams that you are opening and not closing when you no longer need them.
Double-check methods that allocate resources (reading from files, the database, etc.), especially if they read data into streams, and make sure you close those streams in a finally clause.
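For example, a sketch of that pre-Java-7 idiom (the file path and byte-counting body are made up):

import java.io.FileInputStream;
import java.io.IOException;
import java.io.InputStream;

public class ReadWithFinally {
    static int countBytes(String path) throws IOException {
        InputStream in = new FileInputStream(path);
        try {
            int count = 0;
            while (in.read() != -1) {
                count++;
            }
            return count;
        } finally {
            in.close(); // runs even if read() throws, so the stream never leaks
        }
    }
}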
Apart from that, you can try and profile what methods are being called more often, etc, to try and narrow down the problem to a specific part of your code.
I found a site with a reasonable explanation of how Garbage Collection works, and what can cause OutOfMemoryErrors:
http://www.kdgregory.com/index.php?page=java.outOfMemory
If you read through that, there's a specific reference to high allocation of Object[] and byte[], that might point you in the right direction.
Generally speaking, this comes about for one of two reasons:
There is a memory leak in the application, such that the application fails to release items for garbage collection, leading to the JVM running out of memory over time.
The application attempted a one-off operation that would require more memory than is available, leading to the JVM running out of memory due to the operation.
Since your output seems to indicate that the bulk of the memory is consumed by literally a million-plus small byte arrays, my guess is that #1 is probably the culprit; however, to verify this, restart your application and watch its memory consumption over time. It will bounce up and down, but really you only need to watch the trend of consumption. If the average consumption continues to climb over time, you have a memory leak.
To solve this issue, you typically need the source code, and need to find the parts of the code where the troubling objects are being created, used, and then "stored" far beyond the last time that they will ever be used. The solution is to correct the code to no longer store them. HashMaps, Lists, and other Collections are often accomplices in memory leak problems.
If you lack the source code, you can attempt to measure the trend of the memory consumption, and schedule shutdowns and restarts of the application to effectively "reset the clock" such that you choose your downtime instead of watching the application choose it for you.
If it is a one-off operation (not likely, considering your data), then you won't see an upward trend in memory consumption until the triggering event occurs. In such a case, with access to the source code, you should protect your application from processing data that grows far outside normal operating parameters. For example, reading a message from the network typically takes only a few KB, but in exceptional circumstances a client might transmit forever. In that case, stop processing the message and discard it with an error once it exceeds a maximum size limit, say 10 MB.
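A sketch of that guard (the 10 MB cap and the readMessage shape are illustrative, not from the answer):

import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;

public class BoundedReader {
    private static final int MAX_MESSAGE_BYTES = 10 * 1024 * 1024; // 10 MB cap

    static byte[] readMessage(InputStream in) throws IOException {
        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        byte[] chunk = new byte[8192];
        int n;
        while ((n = in.read(chunk)) != -1) {
            if (buf.size() + n > MAX_MESSAGE_BYTES) {
                // Abort rather than let one runaway client exhaust the heap.
                throw new IOException("Message exceeds " + MAX_MESSAGE_BYTES + " bytes; discarding");
            }
            buf.write(chunk, 0, n);
        }
        return buf.toByteArray();
    }
}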
Without access to the source code in the latter scenario, the only hope is to identify the incoming upset, hunt down the source of the errant transmission, and attempt to manipulate it to prevent the overload of output.
The variations on how to approach these techniques are vast, but now you have a few ideas.

How to speed up frequent writing

We created a Java agent which checks our application suite to see whether, for instance, the parent/child structure is still correct. It therefore needs to check 8000+ documents across several applications.
The check itself goes very fast. We use a navigator to retrieve data from views and only read data from those entries. The problem is in our logging mechanism. Whenever we report a log entry with level SEVERE (i.e. a really big issue), the backend document is updated immediately. This is because we don't want to lose any info about these issues.
In our test runs we see that everything runs smoothly, but as soon as we 'create' a lot of severe issues the performance drops enormously because of all the writes. I would like to hear from Notes developers facing the same challenge: how could we speed up the writing without losing any data?
-- added more info after comment from simon --
It's a scheduled agent which runs every night to check for inconsistencies. The goal is of course to find inconsistencies, fix the cause, and eventually have no inconsistencies reported at all.
It's a scheduled agent which runs every night to check for inconsistencies.
OK. So there are a number of factors to take into account.
Are there any embedded JARs? When an agent has embedded JARs, the server has to detach them from the agent to disk before it can run the code. This is done every time the agent executes, which can be a performance hit. If your agent runs a number of times, remove the embedded JARs and put them into the lib\ext folder on the server instead (this requires a server restart).
You mention it runs at night. By default, general housekeeping processes run at night. Check the notes.ini for scheduled server tasks and assess what impact they have on the server/agent while running. For example:
ServerTasksAt1=Catalog,Design
ServerTasksAt2=Updall
ServerTasksAt5=Statlog
In this case, if the agent ran between 2 and 5, Updall could have an impact on it. Also check Program documents for scheduled executions.
In what way are you writing? If you are creating a document for each incident and the document's contents are not large, then the write time should be reasonable. What is liable to hurt performance is one of the following:
If you are multi threading those writes.
Pulling a log document, appending a line, saving and then repeating.
One last thing to think about: if you are getting 3000 errors, there must be a point at which X number of errors means there is no point continuing, and you should instead alert the admin via SNMP/email/etc. It might be worth coding that in as well.
Other than that, you should probably post some sample code in relation to the write.
Hmm, a difficult or rather general question.
As far as I understand, you update the documents in the view you are walking through. I would set View.AutoUpdate to false. This ensures that the view is not reloaded while your code is running, which should speed it up (a minimal sketch follows the quoted help text below).
This is an extract from the Designer help:
Avoid automatically updating the parent view by explicitly setting
AutoUpdate to False. Automatic updates degrade performance and may
invalidate entries in the navigator ("Entry not found in index"). You
can update the view as needed with Refresh.
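A minimal sketch of that setting in an agent, assuming the lotus.domino Java API; the view name is made up:

import lotus.domino.AgentBase;
import lotus.domino.Database;
import lotus.domino.NotesException;
import lotus.domino.Session;
import lotus.domino.View;
import lotus.domino.ViewEntry;
import lotus.domino.ViewNavigator;

public class CheckAgent extends AgentBase {
    public void NotesMain() {
        try {
            Session session = getSession();
            Database db = session.getAgentContext().getCurrentDatabase();
            View view = db.getView("(ParentChildLookup)"); // hypothetical view name
            view.setAutoUpdate(false); // don't refresh the index while we walk it
            ViewNavigator nav = view.createViewNav();
            for (ViewEntry entry = nav.getFirst(); entry != null; entry = nav.getNext(entry)) {
                // read column values only; batch or defer any writes
            }
        } catch (NotesException e) {
            e.printStackTrace();
        }
    }
}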
Hope that helps.
If that does not help you might want to post a code fragment or more details.
Create separate documents for each error rather than one huge document.
or
Write to a text file directly rather than to the database, and then pull it into a document later if necessary. This should speed things up considerably.
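A sketch of that approach (the file path handling and class shape are just an example):

import java.io.BufferedWriter;
import java.io.FileWriter;
import java.io.IOException;

public class SevereLog {
    // Append severe issues to a plain text file instead of saving a backend
    // document per entry.
    private final BufferedWriter out;

    public SevereLog(String path) throws IOException {
        out = new BufferedWriter(new FileWriter(path, true)); // append mode
    }

    public void severe(String message) throws IOException {
        out.write(message);
        out.newLine();
        out.flush(); // flush each entry so nothing is lost if the agent dies
    }

    public void close() throws IOException {
        out.close();
    }
}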

Java Swing GUI frozen on refocus

I have a Swing GUI running on WinXP.
Sometimes, when I do something else (surf the web...) and then want to go back to my program, the GUI appears but is totally frozen; I can't do anything in it.
I have to wait (it can be 10 seconds or 5 minutes) until it works again.
I noticed the same problem when I come back from the screensaver (so I disabled it).
The machine isn't the cause; RAM and CPU usage are fine.
Do you have any idea of the source of this very annoying problem? Maybe a repaint problem?
There might be many explanations for that:
Your app does some heavy operations on the EDT (the event dispatch thread, which handles interface updates)
There might be UI update problems caused by errors in the L&F or components (a rare case)
GC happens due to some internal call and pauses the whole application (less likely)
Some native or old-JDK problems with app windows (almost 0% chance that this is your case)
Usually the first explanation is the one that applies, and in that case you should review your code and move all "heavy" operations onto separate threads, for example:
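(A sketch of the idea; loadData() and statusLabel are placeholders, not code from the question.)

import javax.swing.JLabel;
import javax.swing.SwingUtilities;

public class OffEdtExample {
    private final JLabel statusLabel = new JLabel("working...");

    void startHeavyWork() {
        new Thread(() -> {
            String result = loadData();        // slow work runs off the EDT
            SwingUtilities.invokeLater(        // hop back onto the EDT for UI updates
                    () -> statusLabel.setText(result));
        }).start();
    }

    private String loadData() {
        return "done"; // placeholder for the real heavy operation
    }
}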
Anyway, I can't say anything more specific without seeing the code...

Java application memory usage

I have been writing a small Java application (my first!) that does only a few things at the moment. Currently it runs the Main class, which launches a GUI class (a class I wrote that extends JFrame and only contains a JTextArea), a class that loads a local file of approximately 40 KB through a BufferedInputStream, and a class that loads an entry from a Java properties file.
Everything works wonderfully; however, I was watching the Windows Task Manager and noticed something odd. When I launch the application, the RAM usage jumps to about 40 MB while it loads the local file and pulls a few values from it to display in the JTextArea, which seems normal to me because of the JVM, Java base classes, etc. At that point, though, the application has finished loading the file and merely sits idle, as I currently don't have it doing anything else. While it sits idle, as long as the window is active, the application's memory usage climbs by 10-20 KB every second. This strikes me as odd. If I click on another program to make mine the inactive window, the memory still rises, but at a much slower rate (about 10 KB every 3-5 seconds).
I have not tested how far it will go, but this strikes me as very odd behavior. Is this normal Java behavior? I guess it is possible that my code is leaking memory, but I'm not sure how. I did make sure to close the BufferedInputStream I am using, and I can't see what else would cause this.
I'm sorry if my explanation doesn't make sense, but I would appreciate any insight and/or pointers anyone may have.
UPDATE:
Upon suggestion, I basically stripped my application down to the Main class, which simply calls the GUI class. The GUI class only extends JFrame and sets the window size, close operation, and visible properties. With these changes, the memory still grows by 10-20 KB, but at a slower rate. This, in conjunction with other advice I have received, leads me to believe that this is just Java. I will continue to play with it and let you all know if I find anything else interesting.
Try monitoring the heap usage with jconsole instead of the Windows task manager:
Launch your app with the -Dcom.sun.management.jmxremote option, e.g.:
java -Dcom.sun.management.jmxremote -jar myapp.jar
Launch jconsole from the command line and connect to the local PID of the Java process you started in the previous step.
Click over to the Memory tab and watch heap memory (the default display).
If you watch for a while, you'll probably get a "sawtooth" pattern as the memory climbs over time, but then has sharp drop-offs when the garbage collector runs. You can try to "suggest" garbage collection by clicking the so-labelled button.
When you do this, does the memory usage drop down to the same minimum level, or is the overall minimum increasing over the course of several minutes? If the minimum usage increases, then you have a memory leak. If it always returns to the same minimum level, then you're fine.
Congrats on your first app! Now, a couple of things to think about. First, the Windows Task Manager is not a great resource for understanding how quickly your VM is growing. Instead, you should monitor your garbage collection stats in the console (use the -verbose:gc command-line parameter). Second, if you are concerned about potential leaks and the growth of the VM, there are a number of good profilers out there that are easy to use and can help you diagnose memory issues; check out these two posts for some profiler options.
Congratulations on your first Java app!
Java applications run in a virtual machine. The JVM gives the application a maximum heap size (configurable with -Xmx; the default depends on the JVM version and the machine). As long as the application stays well below that limit, the garbage collector is in no hurry to reclaim "dead" objects, so usage appears to creep upward. You can lower the limit, for example with -Xmx32m, to see collections happen much sooner.
Is this normal Java behavior?
No.
I guess it is possible that my code could be leaking memory
That is definitely the cause. Please post your source code, otherwise further diagnosis isn't possible.
I noticed you are using Swing; make sure you are launching your JFrame on the event dispatch thread, using the invokeLater(Runnable) method (see the sketch after these tips).
If you are using any sort of collections, make sure you clear them once you are done with them.
Since you are doing some file IO, make sure you close all of the classes involved in the IO operations once you are done with them.
If you are using any event listeners, remember to remove them explicitly when they are no longer necessary.
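A minimal sketch of the invokeLater launch; "MyFrame" stands in for whatever JFrame subclass the questioner wrote:

import javax.swing.JFrame;
import javax.swing.SwingUtilities;

public class Main {
    public static void main(String[] args) {
        // Build and show the GUI on the event dispatch thread.
        SwingUtilities.invokeLater(new Runnable() {
            public void run() {
                JFrame frame = new JFrame("My App"); // replace with new MyFrame()
                frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
                frame.setSize(400, 300);
                frame.setVisible(true);
            }
        });
    }
}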
One thing you could try is experimenting. Take your application and remove the file IO; see what happens. Does the memory usage still climb as before? Now restore your application to normal and remove the text area: does the memory still climb as before? And so on. This will help you determine what the source is, and you can focus your efforts there. Most likely you will uncover what you are after by doing this.
Another useful diagnostic technique is to call System.gc() at particular points in time, usually after the heavy-lifting blocks of code. This asks the JVM to perform a garbage collection at that point in the execution, rather than at a time of its own choosing based on memory consumption, which helps you account for periodic fluctuations in your application's memory usage.
Failing that, you can always use a memory profiler. If you are using the NetBeans IDE, there's one built right into it. For Eclipse, there are several plugins which can perform profiling.
It is normal. Some background work might leave dead objects around, which the JVM isn't in a hurry to clean up; eventually they will be garbage collected, when max memory is approached.
Leave your program running overnight, and your machine won't blow up.
