How can I reduce Eclipse Ganymede's memory use? - java

I use the recent Ganymede release of Eclipse, specifically the distro for Java EE and web developers. I have installed a few additional plugins (e.g. Subclipse, Spring, FindBugs) and removed all the Mylyn plugins.
I don't do anything particularly heavy-duty within Eclipse such as starting an app server or connecting to databases, yet for some reason, after several hours use I see that Eclipse is using close to 500MB of memory.
Does anybody know why Eclipse uses so much memory (leaky?), and more importantly, if there's anything I can do to improve this?

I don't know about Eclipse specifically; I use IntelliJ, which also suffers from memory growth (whether you're actively using it or not!). In IntelliJ I couldn't eliminate the problem, but I did slow down the memory growth by playing with the runtime VM options. You could try adjusting these in Eclipse and see if they make a difference.
You can edit the VM options in the eclipse.ini file in your eclipse folder.
I found that (in IntelliJ) the garbage collector settings had the most effect on how fast the memory grows.
My settings are:
-Xms128m
-Xmx512m
-XX:MaxPermSize=120m
-XX:MaxGCPauseMillis=10
-XX:MaxHeapFreeRatio=70
-XX:+UseConcMarkSweepGC
-XX:+CMSIncrementalMode
-XX:+CMSIncrementalPacing
(See http://piotrga.wordpress.com/2006/12/12/intellij-and-garbage-collection/ for an explanation of the individual settings.) As you can see, I'm more concerned with avoiding long pauses during editing than actual memory usage, but you could use this as a start.
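For reference, in Eclipse these options live in eclipse.ini, and any JVM flags must come after the -vmargs marker, one option per line. A minimal sketch of the relevant part of the file (the values are the ones quoted above; whether they help depends on your JVM):

```
-vmargs
-Xms128m
-Xmx512m
-XX:MaxPermSize=120m
-XX:+UseConcMarkSweepGC
```

Anything placed before -vmargs is treated as a launcher argument, not a JVM argument, so flags in the wrong position are silently ignored.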

I don't think the JVM does a lot of garbage collection unless it has to (i.e. it's getting to its limits). Therefore it grabs all the memory it can get, probably up to the limit set in the eclipse.ini (the -Xmx argument, set to 512MiB here).
You can get a visual representation of the current heap status by checking 'Preferences' -> 'General' -> 'Show heap status'. It will create a small gauge in the status bar which also has a 'trash can' button you can use to trigger a manual garbage collection.

Just for information,
you can add
-Dcom.sun.management.jmxremote
to your eclipse.ini file, launch Eclipse, and then monitor its memory usage through jconsole.exe, found in your JDK installation:
C:\[jdk1.6.0_0x path]\bin\jconsole.exe
Choose 'Connection / New connection / eclipse' to monitor the memory used by Eclipse.
Also, always use the latest JVM to launch Eclipse (that does not prevent you from using any other JDK to compile your projects within Eclipse).
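If you prefer numbers over JConsole's GUI, the same heap figures it displays are exposed through the java.lang.management API. A minimal sketch (the class name is just for illustration):

```java
import java.lang.management.ManagementFactory;
import java.lang.management.MemoryUsage;

// Prints the same heap numbers JConsole's Memory tab shows for this JVM.
public class HeapProbe {
    public static void main(String[] args) {
        MemoryUsage heap = ManagementFactory.getMemoryMXBean().getHeapMemoryUsage();
        System.out.println("used      = " + heap.getUsed() / (1024 * 1024) + " MiB");
        System.out.println("committed = " + heap.getCommitted() / (1024 * 1024) + " MiB");
        System.out.println("max       = " + heap.getMax() / (1024 * 1024) + " MiB");
    }
}
```

Here `max` corresponds to the -Xmx limit and `committed` is what the OS actually sees the process holding, which is why Task Manager usually reports more than the "used" heap.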

The Ganymede Java EE plugins are absolutely huge when running in memory. Also, I've had bad experiences with FindBugs and its reliability over a long coding session.
If you can't live without these plugins though, then your only recourse is to start closing projects. If you limit the number of open projects in your workspace, the compiler (and FindBugs) will have less to worry about and your memory usage will drop tremendously.
I usually split up my workspaces by customer and then only keep the bare-minimum projects open within each workspace. Note that if you have particularly large projects (especially ones with a lot of files checked by WST), they will not only chew through your memory but also cause a noticeable pause in responsiveness when compiling.

Eclipse by itself is pretty bloated, and every plugin you add only exacerbates the situation. It's still my favorite IDE, as it certainly isn't short on functionality, but if you're looking for a lightweight IDE then I'd suggest ditching Eclipse; it's pretty normal for it to run up half a gig of memory if you leave it running for a while.

Eclipse is a pretty bloated IDE. You can minimize the impact by turning off automatic project building under Project -> Build Automatically. Closing any open projects you are not currently working on also helps.

I'd call it bloated, but not leaky. (If it were leaky it would climb and climb until something crashed.) As others have said, memory is cheap! It seems like a simple decision to me: spend a tiny bit on more memory vs. lose productivity because you don't have the memory budget to run Eclipse at 500MB.
Summarized rhetorical question: What is more valuable:
The productivity gained from using an IDE you know with the plug-ins you want, or
Spending $50-200 on some memory?

RAM is relatively cheap (not that this is an excuse for poor memory management). Unused memory is essentially WASTED memory. If you're hitting limits and the IDE is the problem, consider less multitasking, adjusting your memory requirements, or buying more. I wouldn't cripple Eclipse if that's your bread-and-butter IDE.

Instead of whining about how much memory Eclipse takes, just go ahead and analyze where the problem is. It might be just one plugin.
Check the blog post "Analyzing memory consumption of Eclipse".
Regards,
Markus

I had a problem with the memory consumption of Java-based programs. I found that it could be related to the chosen JVM (in my case it was). Try running Eclipse with the -client switch.
On some operating systems (most Linux distros, I believe), the default option is the server VM, which consumes noticeably more memory when running applications with a GUI.
In my case the initial memory footprint went down from 300MB to 80MB.
Regards,
Arkadiusz Jamrocha

Well, you don't specify on which platform this occurs. Memory management may vary depending on whether you're using Windows XP, Vista, Linux, OS X, ...
Usually, on my computer (WinXP with 1GB of RAM), Eclipse rarely takes more than 200MB, depending on the size of the open projects, the loaded plugins, and the ongoing action.

I usually give Eclipse 512 MB of RAM (using the -Xmx option of the JVM) and I don't have any memory problems with Ganymede. I upgraded to two GB of RAM a few months ago, and I can really recommend it. It makes a huge difference.

Eclipse generally keeps a lot of meta-data in memory to allow for all kinds of IDE gymnastics.
I have found that the default configuration of Eclipse works well for most purposes, and that includes a limit (given either explicitly or implicitly by the JVM) on how much memory can be consumed; Eclipse will stay within that.
Is there any particular reason you are concerned about memory usage?

Related

How can I find out why Tomcat is using so much memory and stop it?

I'm profiling my webapp using YourKit Java Profiler. The webapp is running on Tomcat 7 v30, and I can see that the heap of the JVM is ~30 megabytes, but tomcat.exe is using 200 megabytes and keeps rising.
Screenshot: http://i.imgur.com/Zh9NGJ1.png
(On left is how much memory profiler says Java is using, on right is Windows usage of tomcat.exe)
I've tried adding different flags to tomcat, but still the memory usage keeps rising and rising. I've tried precompiling my .jsp files as well in case that helps, but it hasn't.
The flags I've added to tomcat's java flags:
-XX:+UseG1GC
-XX:MinHeapFreeRatio=10
-XX:MaxHeapFreeRatio=10
-XX:GCTimeRatio=1
Tomcat is also running as a windows service if that matters at all.
I need assistance figuring out how to get Tomcat to use less memory, or at least knowing why it's using so much. As it is now, it keeps going until it uses the whole system's memory.
So the solution that I found was to add some flags to the tomcat run.
Not sure which flag it was. I think it might've been the jacob library we were using, or some combo of these flags with that. Hopefully this can help people in the future.
-XX:+UseG1GC
-XX:MinHeapFreeRatio=10
-XX:MaxHeapFreeRatio=10
-XX:GCTimeRatio=1
-Dcom.jacob.autogc=true
-Dorg.apache.jasper.runtime.BodyContentImpl.LIMIT_BUFFER=true
You should look for memory leaks in your application, or large sessions that live too long without being invalidated. Try to think about which functionality holds too many objects for long periods.
You could dump your memory and see what is using it. Probably it will be a long list of your application's objects, or strings you unknowingly intern.
You might use a tool like jvisualvm, or a great Eclipse tool, the Memory Analyzer (MAT): http://www.eclipse.org/mat/ to do that.
If you do that and still don't know why, then post what objects are in your memory.
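To illustrate the kind of retention to look for in a dump: the classic case is a static or session-scoped collection that only ever grows. A deliberately leaky sketch (class and method names are made up for illustration):

```java
import java.util.ArrayList;
import java.util.List;

// Deliberately leaky: the static cache retains every buffer forever,
// so the heap grows with each call even after the request is finished.
public class RetentionDemo {
    private static final List<byte[]> CACHE = new ArrayList<>();

    public static void handleRequest() {
        CACHE.add(new byte[1024 * 1024]); // 1 MiB retained per request
    }

    public static int retainedMiB() {
        return CACHE.size();
    }

    public static void main(String[] args) {
        for (int i = 0; i < 5; i++) handleRequest();
        System.out.println("Retained: " + retainedMiB() + " MiB");
    }
}
```

In a heap dump of such a process, MAT's dominator tree immediately shows the static list dominating all those byte[] instances, which is exactly the signature of unintentional retention.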

Scala debugging under IntelliJ is horribly slow

I'm currently tracing through a relatively small Scala program (Apache's Kafka) with IntelliJ 10.5.4, while at the same time running several other Java applications in another project. While the Java applications are doing just fine, the Scala debugging is horribly slow, a simple "Make Project" will take easily 2 minutes, and even basic cursor keystrokes take up to a second each to reflect on the screen.
I've used IntelliJ before with as many as 6 applications running at the same time, under multiple projects, and it doesn't have any problems at all. Hardware isn't an issue either, this is all running on a very recent Macbook Pro (256GB SSD, 8Gb RAM, quad-core i7), under Java 1.6.0_31.
Are there any tips/tricks to making Scala perform decently while debugging under IntelliJ? Alternatively, what are people out there using for Scala debugging?
Try the latest IDEA 11 EAP with the latest Scala plugin. It works fine for me.
tl;dr
re: compile times, do you have FSC enabled? This dramatically helps Scala compile times.
re: overall slowness, you probably need to tweak your JVM settings. Scala can be more memory-intensive, so you may have to increase your -Xmx value to spend less time garbage collecting. Or lower it, if it's set dramatically too high. Or change the garbage collector. For reference, here are mine:
<key>VMOptions</key>
<string>-ea -Xverify:none -Xbootclasspath/a:../lib/boot.jar -XX:+UseConcMarkSweepGC </string>
<key>VMOptions.i386</key>
<string>-Xms128m -Xmx512m -XX:MaxPermSize=250m -XX:ReservedCodeCacheSize=64m</string>
<key>VMOptions.x86_64</key>
<string>-Xms128m -Xmx800m -XX:MaxPermSize=350m -XX:ReservedCodeCacheSize=64m -XX:+UseCompressedOops</string>
Actually figuring it out
You will probably have to do some profiling to find out what the performance problem actually is. The simplest thing to start with would be to open Activity Monitor to see if memory is low, CPU utilization is high, or disk activity is high. This will at least give you a clue as to what's going on. You have an SSD, so disk I/O is probably not a problem.
You may need to profile IDEA itself, such as running it with YourKit Java Profiler. This could tell you something interesting, such as if IDEA is spending an excessive amount of time garbage collecting.
In general, though, Scala seems a bit more memory-intensive than Java on IntelliJ. It's possible that you have -Xmx set too low, causing excessive garbage collections. Or maybe it's set too high, so that when it does garbage collect, you get pauses in the app. Changing which collector you're using may help with that, but with all the collectors there's a point of diminishing returns where setting -Xmx too high causes performance problems.
I found out that I had accidentally put a breakpoint in one of List's methods. It made IntelliJ super slow while debugging the Scala project, but when I removed the breakpoint IntelliJ became normal again.
The list of breakpoints can be opened with Ctrl+Shift+F8.
Eclipse with http://scala-ide.org/download/current.html is working very well for me as of 2.0.1.
My stuff is maven based at the moment and the projects are combined Java and Scala. I can debug across language boundaries without problems.

GWT: High resource usage during mvn goals

I have noticed a high memory and CPU usage during mvn-gwt operation especially during compile phase. Memory usage just soars. I just wanna know if this is normal and if anyone else is experiencing this.
My current JVM setting is -Xms64m -Xmx1280m -XX:MaxPermSize=512m
I think it's normal, because the compilation phase in GWT is really very resource-intensive. GWT provides a large library (in gwt-user.jar) that must be analyzed during compilation, plus a number of compiler optimizations that require a lot of memory and processing power. Thus, the GWT compiler uses a lot of memory internally.
Yes, it's normal. It derives from the awesome CPU utilization Google achieved when they wrote the gwtc command (gwtc = GWT Compile).
I consider it a good thing, since the tradeoff for CPU would typically be memory usage, which is far more valuable to me.
(I do not work for Google :-))
The GWT compiler has a localWorkers setting that tells it how many cores to use. The more cores, the more memory it will use. If you are using the Eclipse plugin, it defaults to using just one (I believe), but the Maven plugin defaults to using all cores on your machine (i.e. if you have a quad core, it will use localWorkers 5).
Interestingly I've been following the advice found here: http://josephmarques.wordpress.com/2010/07/30/gwt-compilation-performance/ which says that localWorkers 2 is an ideal setting for memory usage and speed. That way my machine doesn't lock up during the compile, and the speed difference is very minor.
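In the Maven build, localWorkers goes in the plugin configuration. A sketch of the relevant pom.xml fragment, assuming the Codehaus gwt-maven-plugin (version and the surrounding build section omitted; values are illustrative):

```xml
<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>gwt-maven-plugin</artifactId>
  <configuration>
    <!-- cap compiler parallelism: fewer workers means lower peak memory -->
    <localWorkers>2</localWorkers>
    <!-- heap for the forked GWT compiler JVM -->
    <extraJvmArgs>-Xmx1024m</extraJvmArgs>
  </configuration>
</plugin>
```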

What's a good free tool for investigating unintentional object retention in Java?

My multithreaded Java program crashes because it runs out of heap space and I don't think it should. Assuming the culprit is unintentional object retention, what's a good free tool to investigate what objects are being unintentionally retained?
My IDE is Eclipse.
Here's a list of open source tools you can look at: http://java-source.net/open-source/profilers . Of course, JMap and JConsole are also possible solutions.
A tool like Eclipse MAT will help to find greedy memory pigs and has even a memory leak detector.
The memory profiler of Visual VM might also help if you need to go at a lower level.
The last time I looked into free profilers, they weren't nearly as good as the established commercial ones.
I recommend evaluating
JProfiler
YourKit
JProbe
and investing the money for a license of the tool you like best.
A good profiler, compared to a bad one, can easily save you a day of debugging work immediately and that pays for the license (and for the people doing the great job developing these nice tools).
All three plug into Eclipse and allow you to start profiling directly from Eclipse, from your current project, so there is no tedious work to set up the CLASSPATH.
Sun's VisualVM is free, but I am a big fan of JProfiler which is a commercial app, although you can get a 30 day trial.
I would start with the tools that come with the JDK: jconsole and jmap. There is a good article about JVM monitoring on java.sun.com.
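As a starting point, these ship with the JDK; the pid below is hypothetical (get yours from jps):

```shell
jps -l                                          # list running JVMs and their pids
jmap -histo:live 12345 | head                   # histogram of live objects, biggest first
jmap -dump:live,format=b,file=heap.hprof 12345  # full dump, openable in MAT or VisualVM
jconsole 12345                                  # attach the graphical monitor
```

The `:live` option forces a full GC first, so only objects that survived collection (the actually retained ones) appear in the histogram and dump.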

Why does Eclipse crash with Xmx, XX:MaxPermSize above certain values?

Running Eclipse 3.5.1, JDK 1.6.0_17 on WinXP SP3 32Bit with 3.5 gigs of RAM.
I'm aware of the known Eclipse best practices, still trying to figure out eclipse.ini.
This will launch: -Xmx588m and this won't: -Xmx589m.
Same with -XX:MaxPermSize. Anything above -XX:MaxPermSize=140m won't launch.
The screen of death is similar to this (taken from here).
Any ideas on why might this be happening?
See this eclipse bug.
The general problem is that the JVM requires a contiguous block of memory for the heap. On Windows, the process gets 2 GB of address space, and other libraries that get loaded are placed in different areas of that space. If a library happens to get placed in the middle, it basically cuts in half the contiguous size you can use.
The Eclipse launcher will be loading some system dlls to do graphics, and in particular, user32.dll can result in 3rd party libraries getting loaded depending what is installed on your machine (seen here).
To achieve higher memory limits, you can force the jvm to be forked into a separate process from the eclipse launcher. The jvm process won't be loading these extra libraries until after the vm has initialized its memory. Do this by using a -vm argument pointing to a javaw.exe:
eclipse -vm C:\jdk\jre\bin\javaw.exe
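Equivalently, this can go in eclipse.ini; note that -vm and its path must be on separate lines and must appear before -vmargs (the path is illustrative, matching the example above):

```
-vm
C:\jdk\jre\bin\javaw.exe
-vmargs
-Xmx512m
```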
See also this bug 203325 for suggestions about how monitoring how your memory is used, with tools like:
JConsole
Memory Monitor tool (eclipse tool)
Note, as commented here:
If you do not declare a maximum memory limit, then it is left to the JVM's defaults.
The -Xmx <size> option is also useful when programming for a small device, like J2ME: if you are targeting a memory-constrained device, you can use -Xmx to emulate its limits even though you are developing on a PC.
