Can't consume entire memory - java

I am trying to write a program that consumes a specific amount of memory. The issue I am wondering about is that I am getting an OutOfMemoryError when there is actually free space in the heap.
Here is the code:
import java.util.Vector;

public class MemoryEater1 {
    public static void main(String[] args) {
        try {
            long mb = Long.valueOf(args[0]);
            Vector<byte[]> v = new Vector<byte[]>();
            Runtime rt = Runtime.getRuntime();
            while (true) {
                if (v.size() > 0) {
                    if (((long) v.size()) * 100 < mb) {
                        System.out.println("total memory: " + rt.totalMemory() / 1024 / 1024);
                        System.out.println("max memory: " + rt.maxMemory() / 1024 / 1024);
                        System.out.println("free memory: " + rt.freeMemory() / 1024 / 1024);
                        System.out.println("Trying to add 100 mb");
                        // 100 MB
                        byte[] b = new byte[104857600];
                        v.add(b);
                    }
                } else {
                    // 100 MB
                    byte[] b = new byte[104857600];
                    v.add(b);
                    System.out.println("Added 100 mb");
                }
            }
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
The command to start it:
java -Xmx4096m MemoryEater1 3000
And the output:
total memory: 2867
max memory: 3641
free memory: 59
Trying to add 100 mb
Exception in thread "main" java.lang.OutOfMemoryError: Java heap space
at MemoryEater1.main(MemoryEater1.java:18)
The difference between max memory and total memory is 774 MB, which should be enough to consume another 100 MB, but the error still occurs, even though the machine has plenty of free resources:
[user@machine ~]$ free -m
             total       used       free     shared    buffers     cached
Mem:         15950        3447      12502         0        210       2389
-/+ buffers/cache:          847      15102
Swap:         4031   1759218603       8941
Why can that be?

I don't think it's fragmentation as you only have one thread allocating memory and not reclaiming anything.
Your particular garbage collector is to blame: different collectors manage memory differently, resulting in more or less of it being unavailable to your application. You can find out which one is used by analyzing the output of java -XX:+PrintCommandLineFlags.
You can try G1, which manages memory differently:
java -Xmx4096m -XX:+UseG1GC MemoryEater1 3000
Or play with the generation sizes, e.g. -XX:NewSize and so on.
For more information, read up on the VM options and anything on garbage collector algorithms, e.g. the GC tuning guide (http://www.oracle.com/technetwork/java/javase/gc-tuning-6-140523.html), which gives a quick illustration of how splitting memory into different generations can make part of it unavailable.
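If you want to see how the heap is actually partitioned on your JVM, here is a minimal sketch using the standard MemoryPoolMXBean API (the class name PoolSizes is just for illustration). On a generational collector the young-generation pools reserve part of -Xmx, which is typically where the "unavailable" space goes:
import java.lang.management.ManagementFactory;
import java.lang.management.MemoryPoolMXBean;

public class PoolSizes {
    public static void main(String[] args) {
        // Print the maximum size of each memory pool to see how much of the
        // heap is reserved for the young generation spaces.
        for (MemoryPoolMXBean pool : ManagementFactory.getMemoryPoolMXBeans()) {
            long max = pool.getUsage().getMax(); // -1 means no defined maximum
            System.out.println(pool.getName() + " max: "
                    + (max < 0 ? "undefined" : (max / 1024 / 1024) + " MB"));
        }
    }
}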

It is possible to set the JVM flags for an amount of memory greater than the physical memory on your machine. With the information given so far, it seems most likely that you lack the physical memory to allocate this array. In other words, the JVM requests more memory from the OS and the OS says there's none to give.
Another possibility is the memory fragmentation issue noted in the comments to your question. I think that is less likely in this case because of the structure of your program, but I don't think it can be ruled out.

Related

Finding how much virtual memory is allocated to a process

Is there a way to find out how much total virtual memory is allocated to a specific process? I have a program and I am also putting in a performance tracker which will monitor how much memory the process is using and how much is left for it to use.
To do this, I need to know how much memory is allocated to the process. Is there a way to do this in Java? I am running on Windows 7.
Also, I am currently using the Sigar classes to monitor other memory statistics. Does Sigar have a specific class/function that can find what I am looking for?
You can use VisualVM.
In your code, to calculate the used memory:
Runtime runtime = Runtime.getRuntime(); // Get the Java runtime
long memory = runtime.totalMemory() - runtime.freeMemory(); // Calculate the used memory
To add to what @Mitesh mentioned:
int var = 1; // divisor: 1 for bytes; change it here to print in KB or MB as you wish
log.info("************************* PRINTING MEMORY USAGE - BEGIN **************");
Runtime runtime = Runtime.getRuntime();

/* Total number of processors or cores available to the JVM */
log.info("Available processors (cores): " + runtime.availableProcessors());

/* This will return Long.MAX_VALUE if there is no preset limit */
long maxMemory = runtime.maxMemory();
/* Maximum amount of memory the JVM will attempt to use */
log.info("Maximum memory : "
        + (maxMemory == Long.MAX_VALUE ? "no limit" : maxMemory / var));

/* Total memory currently available to the JVM */
long totalMemory = runtime.totalMemory() / var;
log.info("Total memory available to JVM : " + totalMemory);

/* Total amount of free memory available to the JVM */
long freeMemory = runtime.freeMemory() / var;
log.info("Free memory : " + freeMemory);

// Calculate the used memory
long usedMemory = totalMemory - freeMemory;
log.info("Used memory : " + usedMemory);
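If Sigar does not expose what you need, here is a minimal sketch using the standard MemoryMXBean (the class name ProcessMemory is just illustrative, not a Sigar class). It reports the committed heap and non-heap sizes, which is roughly what the JVM has actually claimed from the OS for those areas; it does not cover thread stacks or direct buffers:
import java.lang.management.ManagementFactory;
import java.lang.management.MemoryMXBean;
import java.lang.management.MemoryUsage;

public class ProcessMemory {
    public static void main(String[] args) {
        MemoryMXBean memoryBean = ManagementFactory.getMemoryMXBean();
        MemoryUsage heap = memoryBean.getHeapMemoryUsage();
        MemoryUsage nonHeap = memoryBean.getNonHeapMemoryUsage();
        // "committed" is what the JVM has claimed from the OS for these areas,
        // as opposed to "used", which is what live objects currently occupy.
        System.out.println("Heap used/committed: "
                + heap.getUsed() / 1024 / 1024 + " / "
                + heap.getCommitted() / 1024 / 1024 + " MB");
        System.out.println("Non-heap used/committed: "
                + nonHeap.getUsed() / 1024 / 1024 + " / "
                + nonHeap.getCommitted() / 1024 / 1024 + " MB");
    }
}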

Allocating an array of 1 billion integers throws OutOfMemoryError [duplicate]

Using the -Xmx1G flag to provide a heap of one gigabyte, the following works as expected:
public class Biggy {
    public static void main(String[] args) {
        int[] array = new int[150 * 1000 * 1000];
    }
}
The array should represent around 600 MB.
However, the following throws OutOfMemoryError:
public class Biggy {
    public static void main(String[] args) {
        int[] array = new int[200 * 1000 * 1000];
    }
}
This is despite the fact that the array should only take around 800 MB and should therefore easily fit in memory.
Where's the missing memory gone?
In Java you typically have multiple regions (and sub-regions) in the heap. With most collectors you have a young and a tenured region. Large arrays are added to the tenured area straight away, but based on your maximum memory size some space is reserved for the young space. If you allocate memory slowly, these regions will resize, but a large block like this can simply fail, as you have seen.
Given that memory is usually relatively cheap (not always the case), I would just increase the maximum to the point where you would want the application to fail if it ever used that much.
BTW: If you have a large structure like this you might consider using direct memory.
IntBuffer array = ByteBuffer.allocateDirect(200 * 1000 * 1000 * 4)
        .order(ByteOrder.nativeOrder()).asIntBuffer();

int a = array.get(n);
array.put(n, a + 1);
It's a bit tedious to write, but it has one big advantage: it uses almost no heap (there is less than 1 KB of overhead).
There is enough memory available, but not as a single contiguous block of memory, as is needed for an array. Can you use a different data structure that uses smaller blocks of memory, or several smaller arrays?
For example, the following code does work with -Xmx1G:
public class Biggy {
    public static void main(String[] args) {
        int[][] array = new int[200][];
        for (int i = 0; i < 200; i++) {
            array[i] = new int[1000 * 1000];
            System.out.println("i=" + i);
        }
    }
}
Heap memory is divided between three spaces:
Old Generation
Survivor Space
Eden Space
An object this large will be allocated straight into the old generation and will remain there.
By default, the virtual machine grows or shrinks the heap at each collection to try to keep the proportion of free space to live objects at each collection within a specific range. This target range is set as a percentage by the parameters -XX:MinHeapFreeRatio= and -XX:MaxHeapFreeRatio=, and the total size is bounded below by -Xms and above by -Xmx.
The default ratio in my JVM is 30/70, so the maximum size of an object in the old generation is limited (with -Xmx1G) to about 700 MB (by the way, I'm getting the same exception when running with default JVM parameters).
However, you can size the generations using JVM options. For example, if you run your class with the parameters -Xmx1G -XX:NewRatio=10, then new int[200 * 1000 * 1000]; will succeed.
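If you want to verify how the generations actually end up sized, a HotSpot JVM of this era prints the capacities of eden, the survivor spaces and the old generation at exit when started with -XX:+PrintGCDetails, for example:
java -Xmx1G -XX:NewRatio=10 -XX:+PrintGCDetails Biggy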
From what I can tell, Java wasn't designed to hold large objects in memory. The typical memory usage pattern in an application is a graph of many relatively small objects, and you typically get an OutOfMemoryError only when you run out of space in all of the spaces.
Below are a couple of useful (and interesting to read) articles:
Ergonomics in the 5.0 Java[tm] Virtual Machine
Tuning Garbage Collection with the 5.0 Java[tm] Virtual Machine

How to make OutOfMemoryError occur on Linux JVM 64bit

In my unit test I am deliberately trying to raise an OutOfMemoryError. I use a simple statement like the following:
byte[] block = new byte[128 * 1024 * 1024 * 1024];
The code works on Windows 7 64-bit with JDK 6u21 64-bit. But when I run it on CentOS 5 64-bit with JDK 6u21, no OutOfMemoryError is thrown, even when I make the size of the array bigger.
Any idea?
Linux doesn't always allocate you all the memory you ask for immediately, since many real applications ask for more than they need. This is called overcommit (it also means sometimes it guesses wrong, and the dreaded OOM killer strikes).
For your unit test, I would just throw OutOfMemoryError manually.
If you just want to consume all the memory, do the following:
try {
    List<Object> tempList = new ArrayList<Object>();
    while (true) {
        // 128 MB per iteration (beware: 128 * 1024 * 1024 * 1024 overflows int to 0)
        tempList.add(new byte[128 * 1024 * 1024]);
    }
} catch (OutOfMemoryError OME) {
    // OK, Garbage Collector will have run now...
}
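If, on the other hand, you only want to exercise the error-handling path, a minimal sketch that throws the error manually is usually enough (assuming JUnit 4 is on the classpath; the class and method names are made up for illustration):
import org.junit.Test;

public class OomHandlingTest {
    @Test(expected = OutOfMemoryError.class)
    public void simulatedOutOfMemory() {
        // Simulate the condition instead of actually exhausting the heap.
        throw new OutOfMemoryError("simulated for the test");
    }
}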
128 * 1024 * 1024 * 1024 = 0, because int arithmetic is 32-bit and the multiplication overflows. Java also doesn't support arrays with more than Integer.MAX_VALUE (about 2^31) elements.
ulimit -v 102400
ulimit -d 102400
unitTest.sh
The above should limit your unit test to 100 MB of virtual memory and a 100 MB data segment size (ulimit takes its values in kilobytes). When you reach either of those limits, your process should get ENOMEM. Careful: these restrictions stay in effect until the process/shell where you set them exits; you might want to run them in a subshell.
man 2 setrlimit for details on how that works under the hood. help ulimit for the ulimit command.
You could deliberately set the maximum heap size of your JVM to a small amount by using the -Xmx flag.
Launch the following program:
public final class Test {
    public static void main(final String[] args) {
        final byte[] block = new byte[Integer.MAX_VALUE];
    }
}
with the following JVM argument: -Xmx8m
That will do the trick:
Exception in thread "main" java.lang.OutOfMemoryError: Java heap space
at Test.main(Test.java:4)
Minor point, but allocating new long[Integer.MAX_VALUE] will use up memory 8x faster (~16 GB each).
The reason there is no OutOfMemoryError is that the memory is being allocated in an uncommitted state, with no pages backing it. If you write a non-zero byte into each 4 KB of the array, that will force the memory to actually be allocated.
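Here is a minimal sketch of that idea (the 4 KB step is an assumption about the page size, and the 512 MB array is just an example that fits in int arithmetic; run it with a heap of at least that size, e.g. -Xmx1g):
public class TouchPages {
    public static void main(String[] args) {
        // Allocate a large array; with overcommit its pages may stay unbacked.
        byte[] block = new byte[512 * 1024 * 1024];
        // Write a non-zero byte into every 4 KB page so the OS has to commit
        // real memory for the array.
        for (int i = 0; i < block.length; i += 4096) {
            block[i] = 1;
        }
        System.out.println("Touched " + block.length / 4096 + " pages");
    }
}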

How to measure sum of collected memory of Young Generation?

I'd like to measure memory allocation data from my Java application, i.e. the sum of the sizes of all objects that were allocated. Since object allocation is done in the young generation, this seems to be the right place to look.
I know JConsole and I know the JMX beans, but I just can't find the right attribute. Right now we are parsing the GC log output file, but that's quite hard. Ideally we'd like to measure it via JMX.
How can I get this value?
Additional info after Chadwick's comment:
I want to know how much memory my application is using. It's quite a big piece of software running in a JBoss application server. Every 4 weeks there is a new release, and we need to compare the memory consumption of the old and new versions. It's not enough to compare the current value of the old generation at a specific time; it's very useful to know how much more or less memory gets allocated. Since many objects get collected in the young generation, I need to measure it there.
In the meantime I have an estimate for this. I will post it as an answer.
Thanks,
Marcel
You can monitor the Young Generation using the MemoryPool MBeans, and more specifically
http://java.sun.com/j2se/1.5.0/docs/api/java/lang/management/MemoryPoolMXBean.html
See the code examples at http://www.javadocexamples.com/java_source/com/sun/enterprise/admin/mbeans/jvm/MemoryReporter.java.html and http://www.java2s.com/Code/Java/Development-Class/ThisVerboseGCclassdemonstratesthecapabilitytogetthegarbagecollectionstatisticsandmemoryusageremotely.htm
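As a starting point, here is a minimal sketch that reads the young generation pools (pool names differ between collectors, e.g. "PS Eden Space", "Par Eden Space" or "G1 Eden Space", so the match below is deliberately loose). getCollectionUsage() reports the pool's usage as measured right after its last collection:
import java.lang.management.ManagementFactory;
import java.lang.management.MemoryPoolMXBean;
import java.lang.management.MemoryType;

public class YoungGenUsage {
    public static void main(String[] args) {
        for (MemoryPoolMXBean pool : ManagementFactory.getMemoryPoolMXBeans()) {
            if (pool.getType() == MemoryType.HEAP && pool.getName().contains("Eden")) {
                // getCollectionUsage() may be null if the pool does not support it
                System.out.println(pool.getName()
                        + " usage: " + pool.getUsage()
                        + ", after last GC: " + pool.getCollectionUsage());
            }
        }
    }
}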
If you are using the Sun JDK, you can simply enable GC logging with -verbose:gc -Xloggc:gc.log and analyze the file. Or, if you need the total amount only occasionally, get the GCViewer by Tagtraum, which computes the number you are looking for.
The YourKit profiler provides a good breakdown of memory usage. There's an evaluation version at http://www.yourkit.com/download/index.jsp
I haven't used it personally, but have you tried jmap?
The -heap option prints generation-wise heap usage.
It's a tool bundled with the JDK, so it's free and probably does not add a lot of overhead.
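For example (with a pre-JDK 9 HotSpot JDK; replace <pid> with the id of the Java process you want to inspect):
jmap -heap <pid>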
What about this code? It gives you the sum of bytes used in all spaces (eden, survivor, old and perm), which is pretty much all the memory used by your JVM instance.
import java.lang.management.GarbageCollectorMXBean;
import java.lang.management.ManagementFactory;
import java.lang.management.MemoryUsage;
import java.util.Map;
import com.sun.management.GcInfo;

public class GcMemoryUsage {
    public static void main(String[] args) {
        for (GarbageCollectorMXBean bean : ManagementFactory.getGarbageCollectorMXBeans()) {
            if (bean instanceof com.sun.management.GarbageCollectorMXBean) {
                com.sun.management.GarbageCollectorMXBean internal = (com.sun.management.GarbageCollectorMXBean) bean;
                Runtime.getRuntime().gc();
                GcInfo gcInfo = internal.getLastGcInfo();
                if (gcInfo == null) { // this collector has not run yet
                    continue;
                }
                for (Map.Entry<String, MemoryUsage> e : gcInfo.getMemoryUsageBeforeGc().entrySet()) {
                    System.out.println(e.getKey() + ": " + e.getValue());
                }
            }
        }
    }
}
That's my idea of an estimation. It basically returns the value of "number of young collections x size of young generation". It's not perfect but works quite well for my use case.
public long getYoungGenAllocatedMB() {
    long youngMaxMemory = 0;
    long youngUsedMemory = 0;
    long youngCollectionCount = 0;

    List<MemoryPoolMXBean> beans = ManagementFactory.getMemoryPoolMXBeans();
    for (MemoryPoolMXBean bean : beans) {
        // the pool name depends on the collector; this matches the ParNew/CMS young generation
        if ("Par Eden Space".equals(bean.getName())) {
            MemoryUsage usage = bean.getUsage();
            youngMaxMemory = usage.getMax();
            youngUsedMemory = usage.getUsed();
            break;
        }
    }

    List<GarbageCollectorMXBean> gBeans = ManagementFactory.getGarbageCollectorMXBeans();
    for (GarbageCollectorMXBean bean : gBeans) {
        if ("ParNew".equals(bean.getName())) {
            youngCollectionCount = bean.getCollectionCount();
            break;
        }
    }

    return (youngCollectionCount * youngMaxMemory + youngUsedMemory) / 1024 / 1024;
}
Thanks to all other posters!
Marcel

Java still uses system memory after deallocation of objects and garbage collection

I am running JVM 1.5.0 (Mac OS X Default), and I am monitoring my Java program in the Activity Monitor. I have the following:
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.util.ArrayList;
import java.util.Date;

public class MemoryTest {

    public static void memoryUsage() {
        System.out.println(
                Runtime.getRuntime().totalMemory() -
                Runtime.getRuntime().freeMemory()
        );
    }

    public static void main(String[] args) throws IOException {
        /* create a list */
        ArrayList<Date> list = new ArrayList<Date>();

        /* fill it with lots of data */
        for (int i = 0; i < 5000000; i++) {
            list.add(new Date());
        } // system shows ~164 MB of physical memory being used

        memoryUsage(); // about 154 MB

        /* clear it */
        list.clear();
        list = null;
        System.gc();
        memoryUsage(); // about 151 KB, the garbage collector worked
        // system still shows 164 MB of physical memory being used

        System.out.println("Press enter to end...");
        BufferedReader br = new BufferedReader(
                new InputStreamReader(System.in)
        );
        br.readLine();
    }
}
So why doesn't the physical memory get freed even though the garbage collector seems to work just fine?
Many JVMs never return memory to the operating system. Whether it does so or not is implementation-specific. For those that don't, the memory limits specified at startup, usually through the -Xmx flag, are the primary means to reserve memory for other applications.
I am having a hard time finding documentation on this subject, but the garbage collector documentation for Sun's Java 5 does address this, suggesting that under the right conditions the heap will shrink if the right collector is used: by default, if more than 70% of the heap is free, it will shrink so that only 40% is free. The command line options to control this are -XX:MinHeapFreeRatio and -XX:MaxHeapFreeRatio.
There are several command line options for the JVM which help to tune the size of the heap used by Java.
Everybody knows (or should know) about -Xms and -Xmx, which set the minimum and the maximum size of the heap.
But there are also -XX:MinHeapFreeRatio and -XX:MaxHeapFreeRatio, which are the respective limits between which the JVM manages free heap space. By shrinking the used heap to stay within these limits, it can lower the memory consumption of the program.
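For example, a launch line along these lines (the values are only illustrative) tells the JVM to aim for between 20% and 40% free heap, shrinking the heap after collections when more than 40% of it is unused:
java -Xms32m -Xmx512m -XX:MinHeapFreeRatio=20 -XX:MaxHeapFreeRatio=40 MemoryTest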
You can find more information here:
Sun Java System Application Server Enterprise Edition 8.1 2005Q1 Performance Tuning Guide, Chapter 4
Tuning Garbage Collection Outline, by Pete Freitag
You need to use a JVM-specific profiler to monitor the actual heap space being used by the program, as opposed to the memory allocated to the JVM. The JVM is not only reluctant to release heap memory that it allocated, but tends to gobble up space for different reasons, including just-in-time compilation.
Is the OS perhaps showing the memory which is currently allocated to the program? Even though ~150 MB is allocated, that does not mean ~150 MB is in use.
