Matlab memory management; insufficient Java heap

I was hoping that someone on here would be able to explain, or point me to a webpage where I could learn more about, Matlab's memory management. I know that Matlab is a higher-level language, so it takes care of memory management, which is both good and bad: good because I don't need to worry about it, and bad because I have no idea what it is doing under the hood.
The reason I ask is that lately I've been getting this error message a lot:
Insufficient Java heap memory to continue operation
Granted, I'm using a mid-2010 15" Macbook Pro with only 4 GB of RAM, not really the best computer to be performing all the image operations that I do. I know Matlab has a delete function, but I don't know when/if it would be helpful to use it to save memory. I have only used delete before in hardware-related tasks: when I am sending data over serial, I delete my serial object. But beyond that, should I be using delete for my own memory management?

See this question. To prevent the Java heap error, you will need to change the JVM options. Change the default value in Matlab preferences or create a new java.opts file with an -Xmx (and optionally -Xms) option, e.g.,
-Xmx1g
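Once Matlab has been restarted, you can confirm the new ceiling from the Matlab command window by asking the JVM itself (a small sketch; the reported value may be slightly below the figure you requested):
% maximum Java heap available to this Matlab session, in bytes
java.lang.Runtime.getRuntime.maxMemory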

To set the equivalent of the -Xmx parameter in more recent versions of Matlab, go to Preferences (on the toolbar/ribbon), then
MATLAB > General > Java Heap Memory
There is a slider there. Matlab will have to be restarted for this to take effect.
There is a warning about doing this, but as I wrote elsewhere, I think it can be ignored.
In case you only have non-graphical access to Matlab through a terminal, the effect of the above change was to add the following line to ~/.matlab/R2016a/matlab.prf, so you could do that manually instead.
JavaMemHeapMax=I36532
This line appeared after I used the slider to set the preference to 36,532 MB and quit Matlab. If you can't find your matlab.prf file, check here.

Related

JVM allocates way more than necessary?

My Java heap is allocated at around 123 MB. I need this to be less. I have a 1 GB limit and both programs running are servers; one runs at 953 MB. The server JAR I am trying to run should only take up 10 MB or less. How can I make Ubuntu behave the same as the other OSes I have tested the JAR on? My code can be found on GitHub.
Java Version: JDK/JRE-7
Out-of-the-box Java on *nix can look a little scary when you just look at it via top. The java executable often puts up huge numbers under the VIRT column, like 900m. Why is my small Java program using 900m of RAM?
Actually, it's probably not using 900m of RAM. The JVM has told the OS "I might use this much memory... be prepared". But it's probably not actually using anywhere near that much physical RAM -- and if it's a small program, it'll never come anywhere near that. Any physical RAM that java is not actually using is still freely available to other processes on the system.
For a more accurate picture of how much physical RAM the java process is using, look under top's RES column. Though, a full discussion of *nix memory management and profiling Java is probably outside the scope of this answer. I'd encourage you to try Googling the topic and developing specific questions based on the material you find.
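If you want a one-shot comparison of the two figures (an illustrative sketch; replace <pid> with the process id top shows for java):
# virtual size (VSZ) and resident set size (RSS), both reported in KB
ps -o pid,vsz,rss,cmd -p <pid>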
Most of the time your Java programs (and other programs running alongside them) are going to do just fine using Java's default memory settings. Sometimes you need to limit (or increase) the maximum amount of heap memory that the JVM is allowed to allocate. This is the most commonly tuned Java memory setting, and it is usually set with the -Xmx command-line argument. You can read more about it here and here.
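For a program you start yourself, the flag simply goes on the command line (a minimal sketch; server.jar stands in for whatever JAR you actually run):
# start with a 16 MB heap and never let it grow beyond 64 MB
java -Xms16m -Xmx64m -jar server.jar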
Sometimes it can be a little bit tricky figuring out where to modify java's command-line options if your Java program is being magically started for you, e.g., as a system service, or part of some larger script. Googling Xmx will probably get you started on the conventional way of modifying java arguments for that product.
For example, a Google search for ubuntu tomcat Xmx gives links that point us in the direction of /etc/default/tomcat6.
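In that file the setting typically ends up in a JAVA_OPTS line (a hedged example; the exact variable name and its other contents vary by package version):
# /etc/default/tomcat6
JAVA_OPTS="-Djava.awt.headless=true -Xmx128m"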

Eclipse not releasing memory in Java process on Linux

My Linux server needs to be able to handle 30+ Eclipse instances for developers. I did a quick test of running 10 Eclipse instances. The Java process associated with each Eclipse instance initially uses around 200 MB RSS, increasing to around 550 MB when more projects are loaded.
But the Java process doesn't seem to release memory after closing/deleting all projects within the Eclipse instances; I still see it using over 550 MB RSS.
How can I change Eclipse or Java settings so that the memory footprint is reduced when developers close their projects or are idle for a while?
Thanks
You may want to experiment with these (and other) JVM tuning options to make the JVM less reluctant to return memory to the OS:
-XX:MaxHeapFreeRatio Maximum percentage of heap free after GC to avoid shrinking. Default is 70.
-XX:MinHeapFreeRatio Minimum percentage of heap free after GC to avoid expansion. Default is 40.
However, I suspect that you won't see the eclipse process shrink to anywhere near its initial size, since eclipse is a huge, complex application that probably lazy-loads (but does not unload, once used) a lot of classes and associated data structures.
I've never seen Java release memory.
I don't think you will get any value out of trying to get it to release memory with Eclipse; I've watched that little memory counter for YEARS and never once seen the allocated memory drop.
You might try one of these.
After each session, exit the JVM and restart.
Set your -Xmx lower.
Separate your instances into categories with high -Xmx and low -Xmx and let the user determine which one he wants.
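For the last two suggestions, the cap can be passed when launching Eclipse (an illustrative sketch; pick values that suit your categories):
# "small" instance for light projects
eclipse -vmargs -Xmx256m
# "large" instance for heavyweight workspaces
eclipse -vmargs -Xmx1024m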
As a side-thought, if it really mattered to you, you MIGHT be able to run multiple Eclipse instances under one VM. It would probably be WAY too much work (man-weeks to man-years), but if you could get it right you could reduce overhead by something like 150-200 MB per instance. The disadvantage would be that a VM crash (pretty rare these days) would kill everyone.
Testing this theory would be a matter of calling eclipse's main from within an existing JVM and trying to get it to display somewhere useful. The rest of the man-year is spent trying to figure out where they used evil static variables or singletons and changing them to something else.
Switch Java to use the G1 garbage collector with the HeapFreeRatio parameters. Use these options in eclipse.ini:
-XX:+UnlockExperimentalVMOptions
-XX:+UseG1GC
-XX:MinHeapFreeRatio=5
-XX:MaxHeapFreeRatio=25
Now when Eclipse eats up more than 1 GB of RAM for a complicated operation and drops back to 300 MB after garbage collection, the memory will be released back to the operating system.
I would suggest looking into garbage collection; setting the right options, or even forcing GC periodically, might increase the time until Eclipse's memory usage grows high.
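If you do want to trigger a collection from outside the process, a sketch (assuming a JDK recent enough to ship jcmd; <pid> is the Eclipse Java process id):
# ask the target JVM to run a full GC
jcmd <pid> GC.run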
The following link might be useful: http://www.eclipsezone.com/eclipse/forums/t93757.html

java process on windows using less memory than specified in -xms?

I'm starting my server with the command "java -xms 1280m -xmx 1280m". On Linux machines this works fine and I see the process using almost the same amount of memory. On Windows machines, however, I see the java process using much less than 1280m - around 500-600m. I gathered this data from the Windows task manager, if that matters. The two Windows machines I checked are both Windows 2003 servers and have 2 GB and 3 GB of RAM respectively.
I always thought that specifying the initial heap size with -xms would force Java to use at least that much memory. Am I wrong? Or is this a peculiarity of Java on Windows?
Look closer. The task manager is often misleading - by default it will not show how much memory a process has allocated. Rather, what is shown as "memory used" is the amount of physical memory swapped in for that process.
In the View menu, choose "Select columns" and add "Size of virtual memory". There's your memory. Your application obviously never really uses more than 500-600m, so it's never swapped in.
The Windows task manager has been designed for end users, not for programmers. The latter usually prefer Process Explorer (procexp.exe) from the Sysinternals suite. That, combined with vmmap.exe, will show you exactly what is going on.
Finally back at a computer, I ran a couple of quick tests.
On my Windows XP machine, running java -xms gives the output Unrecognised option.
When running java -Xms I get an invalid initial heap size, which is correct as I'm not giving any value, but it accepts and recognises the option.
So it seems my comment was valid and you'll need to sort the casing on your command.
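In other words, with the casing fixed the command would look like this (a sketch; substitute your own JAR or main class):
# -Xms/-Xmx take the size directly, with no space before the value
java -Xms1280m -Xmx1280m -jar server.jar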
In addition to what Kevin D said about capitalization, note that 32-bit Windows systems generally have an upper bound on the max heap size. It tends to vary based on a lot of factors, but I've often seen it right around the 1280m that you are trying. I doubt that is the issue here, but it could be a related issue.

Java using too much memory on Linux?

I was testing the amount of memory Java uses on Linux. When just starting up an application that does absolutely NOTHING, it already reports that 11 MB is in use. When doing the same on a Windows machine, about 6 MB is in use. These were measured with the top command and the Windows task manager. The VM I use on Linux is 1.6.0_11, and the HotSpot VM is Server 11.2. Starting the application using -client did not influence anything.
Why does java take this much memory? How can I reduce this?
EDIT: I measure memory using the Windows task manager, and on Linux I open the terminal and type top.
Also, I am only interested in how to reduce this, or whether I even CAN reduce this. I'll decide for myself whether a couple of megs is a lot or not. It's just that the difference of 5 MB between Windows and Linux is strange, and I want to know if I am able to do this on Linux too.
If you think 11 MB is "too much" memory... you'd better avoid using Java entirely. Seriously, the JVM needs to do quite a lot of stuff (bytecode verifier, GC, loading all the essential classes), and in an age where average desktop machines have 4 GB of RAM, keeping the base JVM overhead (and memory use in general) very low is simply not a design priority.
If you need your app to run on an embedded system (pretty much the only case where 11 MB might legitimately be considered "too much"), then there are special JVMs designed for such systems that use less RAM - but at the cost of lacking many of the features and/or performance of mainstream JVMs.
You can control the heap size, otherwise default values will be used; java -X gives you an explanation of the meaning of these switches.
E.g.,
JAVA_OPTS="-Xms6m -Xmx6m"
java ${JAVA_OPTS} MyClass
The question you might really be asking is, "Do the Windows task manager and Linux top report memory in the same way?" I'm sure there are others who can answer this question better than I, but I suspect that you may not be doing an apples-to-apples comparison.
Try using the jconsole application on each respective machine to do a more granular inspection. You'll find jconsole in your JDK under the bin directory.
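For example (a minimal sketch, assuming the JDK's bin directory is on your PATH; <pid> is the process id that top or the task manager reports for your program):
# attach to a locally running JVM and inspect its heap, threads and loaded classes
jconsole <pid>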
There is also a very extensive discussion of java memory management at http://www.ibm.com/developerworks/linux/library/j-nativememory-linux/
The short answer is that how memory is being allocated is a more complex question than just looking at a single figure in a simplified, user-facing system utility.
Both top and Task Manager will report how much memory has been allocated to a process, not how much the process is actually using, so I would say it's not an apples-to-apples comparison. Regardless, in the age of gigs of memory, what's a couple of megs here or there on startup?
Linux and Windows are radically different operating systems and use RAM very differently. Windows kind of allocates as you go, and Linux caches more at once, and prepares for the future, so that the next operations are smooth.
This explanation is not quite right, but it's close enough for you.

Why do we have to increase the Java Heap?

I know how to set the Java heap size in Tomcat and Eclipse. My question is: why? Was there an arbitrary limit set on the initial heap back when Java was first introduced, so the VM wouldn't grow over a certain size? It seems that with most machines today having a large memory space available, this isn't something we should have to deal with.
Thanks,
Tom
Even now, the heap doesn't grow without limit.
When the oldest generation is full, should you expand it or just GC? Or should you only expand it if a GC doesn't free any memory?
.NET takes the approach you'd like: you can't tell it to only use a certain amount of heap. Sometimes it feels like that's a better idea, but other times it's nice to be able to have two processes on the same machine and know that neither of them will be able to hog the whole of the memory...
I came across this the other day, but I'm not sure if it's what you want: -XX:+AggressiveHeap. According to Sun:
This option instructs the JVM to push memory use to the limit: the overall heap is more than 3850 MB, the allocation area of each thread is 256 K, the memory management policy defers collection as long as possible, and (beginning with J2SE 1.3.1_02) some GC activity is done in parallel. Because this option sets heap size, do not use the -Xms or -Xmx options in conjunction with -XX:+AggressiveHeap. Doing so will cause the options to override each other's settings for heap size.
I wasn't sure if this really meant what I thought it meant, though - that you could just let the JVM gobble up heap space until it is satisfied. However, it doesn't sound like it's a good option to use for most situations.
I would think that it's good to be able to provide a limit so that if you have a memory issue it doesn't gobble up all the system memory leaving you with only a reboot option.
Java is a cross-platform system. Some systems (like Unix and derivatives) have a ulimit command which allows you to limit how much memory a process can use; others don't. Plus, Java is sometimes run embedded, for example in a web browser. You don't want a broken applet to bring down your desktop (well, that was at least the idea, but applets never really caught on - that's another story). Essentially, this option is one of the key cornerstones for sandboxing.
So the VM developers needed a portable solution: they added an option to the VM which allows anyone (user, admin, web browser) to control how much RAM a VM can allocate at most. The needs of the various uses of Java are just too diverse for one size to fit all.
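To illustrate the contrast (a rough sketch; MyApp is a placeholder class and the numbers are arbitrary):
# Unix-only: cap the whole process's address space at 2 GB (bash, value in KB)
ulimit -v 2097152
# portable: let the JVM itself cap its heap at 256 MB, on any OS
java -Xmx256m MyApp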
This becomes even more important today when you look at mobile devices. Your desktop has 2-8 GB of RAM, but your mobile probably has much less. And for these devices, you really don't want one bad app to bring down the device, because there might not even be a user who could check.
