I am unable to filter which specific packages I want to look at, since the CPU settings menu is grayed out.
I am running my application from Eclipse using the VisualVM runner.
CPU settings cannot be changed while sampling is in progress. Set them before you start CPU sampling.
I'm trying to use VisualVM with IntelliJ to profile a Java application. I have the VisualVM Launcher plugin installed in IntelliJ. I press the play button with the orange circle in IntelliJ, which launches VisualVM and opens the process when I start the run. However, when I try to profile the CPU, it doesn't seem to profile the methods in my program. I've tried with several different programs and can't seem to get any of them to work with VisualVM. This is what VisualVM looks like:
The profiler seems to think that the total time is 857 ms or 6.21 ms, when in reality my program takes about a minute to run. It seems to be capturing "DestroyJavaVM", which is not my program. I'm using VisualVM because it is the only free Java profiler I could find. Any suggestions? Here are my VisualVM settings:
As others have suggested, take a look at your "Start profiling from class" setting.
But you might also want to consider that it is a timing issue. As you can see in the background, the process you want to profile has already finished.
Check the call tree and the list of processes on the left to see what you are actually profiling. In your screenshot you are profiling the destruction of the JVM ("DestroyJavaVM"); that does not include your code, so you should not expect to see it there.
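One workaround for the timing problem is to make the program pause at startup so the profiler has time to attach before the real work begins and the JVM exits. Here is a minimal sketch under my own assumptions (runWorkload is a hypothetical stand-in for your actual program logic, not anything from the question):

public class Main {
    public static void main(String[] args) throws Exception {
        // Give VisualVM time to attach before any work starts
        System.out.println("Attach VisualVM, then press Enter to continue...");
        System.in.read();
        runWorkload(); // hypothetical stand-in for the code you want profiled
    }

    private static void runWorkload() {
        // ... your actual program logic ...
    }
}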
How might I compute or determine the memory consumption of a program written in Java using NetBeans?
Since you use NetBeans, there is a Profile Main Project button right next to the Debug button on your toolbar. The Memory option lets you monitor the memory graphically while your main project runs, and Window-->Profiler-->Telemetry Overview gives you a graph of your memory consumption, among all sorts of other details.
If you want to compute the memory usage from within your own Java program, you can calculate it at any time with this code:
double currentMemory = (Runtime.getRuntime().totalMemory() - Runtime.getRuntime().freeMemory()) / (1024.0 * 1024.0);
It gives you the used memory (total heap minus free heap) in megabytes. You could run this at different times and keep the maximum value, or compute some statistics from it.
Also, it will work even if you don't use NetBeans.
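As a rough sketch of the "keep the maximum" idea (my own example, not from the original answer; the class and field names are invented), you can sample the used heap from a daemon thread and remember the peak:

public class PeakMemoryTracker {
    private static volatile double peakMb = 0;

    public static void main(String[] args) throws InterruptedException {
        Thread sampler = new Thread(() -> {
            while (true) {
                Runtime rt = Runtime.getRuntime();
                double usedMb = (rt.totalMemory() - rt.freeMemory()) / (1024.0 * 1024.0);
                if (usedMb > peakMb) {
                    peakMb = usedMb; // only this thread writes, so no race on the update
                }
                try {
                    Thread.sleep(500); // sample every half second
                } catch (InterruptedException e) {
                    return;
                }
            }
        });
        sampler.setDaemon(true);
        sampler.start();

        // ... run your workload here (placeholder pause so the sampler runs) ...
        Thread.sleep(2000);

        System.out.printf("Peak heap usage so far: %.1f MB%n", peakMb);
    }
}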
NetBeans has an awesome built-in profiler. Start out by going to Profile -> Advanced Commands -> Run Profiler Calibration to get it set up. Once you're done, you can profile by clicking the stopwatch at the top of your screen, to the right of "Debug Project".
If you want to check memory consumption and other statistics, you can try:
$JAVA_HOME/bin/jconsole
It lets you choose the PID of your application's process and watch the memory and CPU usage at runtime.
NOTE: You don't need to have NetBeans installed, so it can be used in production environments as well.
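If you want the same numbers programmatically rather than through the jconsole GUI (my addition, just one possible route), the java.lang.management API exposes them from inside the process:

import java.lang.management.ManagementFactory;
import java.lang.management.MemoryUsage;

public class HeapSnapshot {
    public static void main(String[] args) {
        // The same heap figures jconsole displays, read in-process
        MemoryUsage heap = ManagementFactory.getMemoryMXBean().getHeapMemoryUsage();
        System.out.printf("heap used: %.1f MB, committed: %.1f MB, max: %.1f MB%n",
                heap.getUsed() / (1024.0 * 1024.0),
                heap.getCommitted() / (1024.0 * 1024.0),
                heap.getMax() / (1024.0 * 1024.0));
    }
}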
I have Eclipse Helios SR1 installed on Windows XP. I am writing/debugging Java code using JDK 1.6.
When I debug and I hit a breakpoint, Eclipse is fast to show me the stacktrace. (See #1 in attached image.)
However, the source code line highlight (light green, see #2 in attached image) is very slow to appear. Oddly, when I first installed Eclipse, this was very fast. Now it is very slow. It takes about 15 seconds to highlight as light green.
Any ideas what is wrong with my Eclipse install/config?
FYI: Very fast processor + 4GB of RAM. Plenty of disk space. I have tried a "Hello, World" test Java project. Just a few lines of code... still the same issue when hitting a vanilla breakpoint.
According to running-a-program-in-debug-mode-is-incredible-slow, I succeeded by running
eclipse -clean
(try this before you set up a new workspace)
This is surely not an Eclipse problem. If it is highlighting at all, that means it is working.
There must be something wrong on the Windows side. More RAM does not necessarily mean faster processing. Check Task Manager and try to monitor the processes, especially the Java ones. There can be multiple java processes; kill the unnecessary ones.
If the laptop is using some sort of disk encryption, that could certainly cause this.
If your anti-virus is hogging the CPU, that is quite possible too.
One more thing you can do is clean your project manually every day, and also set the console output limit to unlimited.
The answer is simple: Create a new workspace.
I did it and now my debugger is super-fast again.
The auto complete stalls so frequently, and for so long, that I quit using it altogether.
I've had success with the following using Eclipse (Classic) 3.6.1 on Windows 7 x64.
"A workaround, until the fix is released in 3.6.2 is summarized here: http://groups.google.com/group/android-developers/msg/0f9d2a852e661cba"
(copied for convenience)
"You can replace your /plugins/
org.eclipse.jdt.core_3.6.1.v_A68_R36x.jar plugin with one from
http://www.google.com/url?q=http://adt-addons.googlecode.com/svn/patches/org.eclipse.jdt.core_3.6.1.v_A68_R36x.zip&ei=vg5aTf2RIMrUgAeI-qTvDA&sa=X&oi=unauthorizedredirect&ct=targetlink&ust=1297749446528273&usg=AFQjCNFv7FGlTrnoVhRGE35JPjHxOwI_Bw
and restart Eclipse. Content Assists will be much better. Just try it.
Don't forget backup your original plugins. "
This solved part of my problem.
In preferences, I defaulted all the 'Java->Editor->Content assist' screens and the performance is much improved. Any lag I have now is due to system speed and is negligible. I've gone from minutes to seconds building the suggestion list.
UPDATE: This didn't completely solve my problem, but it got me close. The search continues...
UPDATE: I'm developing in Java for Android using the default packages that are included and any that might have come down during an update (in retrospect, choosing Update All in the SDK updater might not have been wise). The timing is fairly consistent online and offline. I did a few tests and found the following:
Start up Eclipse and enter a line of code that can use a .toString(). Typing the '.' populates the auto complete within 2-3 seconds. Type a 't' and it takes 70-75 seconds. After that, 10 seconds. Different objects do the same thing (75 the first time, 10 after that). It's the filtering process that appears to stall. My CPU does not max out and memory is OK, but the program goes Not Responding until it's done. Any typeahead gets cached and eventually filters the list once Eclipse starts responding again.
For me the problem went away when I increased the memory for the VM.
Put this in your eclipse.ini:
-Xms512m
-Xmx1024m
On my 4GB Windows Vista system this would happen A LOT (as well as debug issues when looking up variables).
This all went away after I built my new PC with 8GB RAM. I can now run 4 emulators simultaneously, and it doesn't have any debug problems anymore either. Auto complete with huge lists also works just fine.
It would seem to be just an issue of how much RAM you've got.
Recently, while working on a JSF web app using NetBeans 6.8, I am constantly getting PermGen: Out Of Memory errors. I have also noticed that this is not related to hot-swapping the code, as some people suggested on the forums; I generally restart my local web server, Tomcat 6.0, whenever I redeploy the code. This used to happen to me once in a while, but as of late it has been occurring constantly. I usually can't go more than two minutes before it crashes.
The important observation I've made about this problem is that it only seems to happen when running the debugger. If I launch the server normally, it will run indefinitely. As soon as I run in debug mode, this problem occurs.
I've tried all the tips I've found so far for increasing the JAVA_OPTS memory settings for Java in Tomcat; I've tried increasing the available memory for NetBeans in netbeans.conf. Still no luck. If you want to see the specific configuration changes I've made, I can post those as well.
I've also read that this can be a result of memory leaks in Java. I've tried running the NetBeans profiler, but it would generally crash as well before I could do anything really useful. Additionally, when it did run, all the object allocations with ridiculous generation counts were things in Java libraries, or primitives; char[]s were the biggest memory hog of the app, for example, with the largest generations.
I would really like to know if anyone has had a similar problem before, and if so, how they solved it. This is starting to seriously impede my ability to do my work.
Thanks for any help.
Add this entry in catalina.sh (or .bat); it worked for me:
JAVA_OPTS="-Djava.awt.headless=true -Dfile.encoding=UTF-8
-server -Xms1536m -Xmx1536m
-XX:NewSize=256m -XX:MaxNewSize=512m -XX:PermSize=512m
-XX:MaxPermSize=512m -XX:+DisableExplicitGC
Something I have found useful for tracking down memory leaks without running a profiler or a debugger is the "jmap -histo <pid>" command (it comes with the JDK). Save its output to a file, and run it every few minutes while your application is running. Then collect the outputs and look for objects that keep increasing in number and size. I even wrote a quick app to graph selected objects over time, just to make it easier to see where leaks might be occurring.
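In the same spirit, here is a minimal sketch of that kind of quick app (my own reconstruction, not the answerer's code; HistoWatcher and its arguments are invented for illustration). It runs jmap periodically and prints the histogram row for one class so you can watch its counts grow over time:

import java.io.BufferedReader;
import java.io.InputStreamReader;

public class HistoWatcher {
    public static void main(String[] args) throws Exception {
        String pid = args[0];       // PID of the JVM to inspect
        String className = args[1]; // e.g. "[C" for char[] -- match whatever jmap prints on your JDK
        while (true) {
            Process p = new ProcessBuilder("jmap", "-histo", pid).start();
            try (BufferedReader in = new BufferedReader(
                    new InputStreamReader(p.getInputStream()))) {
                String line;
                while ((line = in.readLine()) != null) {
                    // Histogram rows end with the class name:
                    //   num   #instances   #bytes   class name
                    if (line.endsWith(" " + className)) {
                        System.out.println(System.currentTimeMillis() + "  " + line.trim());
                    }
                }
            }
            p.waitFor();
            Thread.sleep(5 * 60 * 1000); // sample every five minutes
        }
    }
}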