I have a Java application running on Linux with -Xmx set to 1200M.
When I check the process in top, I see numbers like:
VIRT = 1412m
RES = 237m
SHR = 58m
Inside the Java application, I am printing the Runtime.getRuntime().totalMemory() every minute and the number there shows:
totalMemory() = 108M
Why such a big difference between the values of RES and totalMemory()?
My understanding (which could be wrong): totalMemory() is the memory currently used in the heap; RES is the actual RAM used by the process.
As an update:
Yes, I would expect RES > totalMemory(), but the difference here is 237MB - 108MB = 129MB. So if someone asks me what the maximum memory my Java application can use is, it should be 1200MB plus "something"; the question is how to know that "something". Is it 150MB? 200MB? 300MB?
RES will probably include the size of the shared libraries loaded by the JVM, many of which might also be loaded for other applications (such as the C runtime library). As such they don't really count against the actual memory usage of the JVM, but they are considered part of its resident memory image.
Questions about the actual interpretation of the output of ps or top are probably better directed to superuser.com or serverfault.com.
totalMemory() is just the current heap size; the Java process contains the Java heap plus other things (for example, the permanent generation), so the size of RES is always bigger than the Java heap's.
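If you want to see how much of the process footprint lies outside the Java heap, the standard MemoryMXBean can report heap and non-heap areas side by side. A minimal sketch (the class name is just for illustration):

import java.lang.management.ManagementFactory;
import java.lang.management.MemoryMXBean;

// Print heap and non-heap usage side by side, to compare against
// the RES value reported by top for the same process.
public class HeapVsNonHeap {
    public static void main(String[] args) {
        MemoryMXBean mem = ManagementFactory.getMemoryMXBean();
        System.out.println("heap     = " + mem.getHeapMemoryUsage());
        System.out.println("non-heap = " + mem.getNonHeapMemoryUsage());
        System.out.println("totalMemory() = " + Runtime.getRuntime().totalMemory());
    }
}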
Related
I created a DLL that contains a JNI function which is called from C# code. The function is called many times by the C# application which is a long-running GUI application.
My JNI function calls JNI_GetCreatedJavaVMs. If there are zero VMs, then one is created via function JNI_CreateJavaVM(), otherwise my function calls AttachCurrentThread().
Then my function calls FindClass() and loads a class from a JAR file.
Finally I call a method on that class.
In the java code of the invoked method, I first call method totalMemory() and write the result to a log file. The method then performs some other operations.
Usually method totalMemory() returns a value of around 66 million, i.e. approximately 64 MB, but sometimes it returns around two million, i.e. 2 MB. When totalMemory() is only 2 MB, my Java method throws OutOfMemoryError (Java heap space) because there isn't enough memory to allocate the objects that my Java method needs.
Note that my Java method contains catch (Throwable) so that the C# application can continue. After the OutOfMemoryError, on a subsequent call to my JNI function, totalMemory() will again return a value of 64 MB.
What would cause the total memory allocated to the JVM to drop from 64 MB to 2 MB ?
Note that I am referring to the total memory and not the free memory. I assume that the total memory should not change. I believe that my log file proves my assumption since, as I wrote above, it is almost always 64 MB. It just sometimes drops to 2 MB, hence this question. Why does total memory sometimes drop to 2 MB ?
My (stripped-down) JNI code.
#include <jni.h>

int main()
{
    JavaVM *jvm;
    JNIEnv *env;
    JavaVMInitArgs vm_args;
    jint rslt;
    jclass cls;
    jmethodID mid;
    jobject jObj;

    JavaVMOption* options = new JavaVMOption[1];
    options[0].optionString = (char*) "-Djava.class.path=jni_test.jar";
    vm_args.version = JNI_VERSION_10;  // requires JDK 10+; use JNI_VERSION_1_8 on older JDKs
    vm_args.nOptions = 1;
    vm_args.options = options;
    vm_args.ignoreUnrecognized = false;

    rslt = JNI_CreateJavaVM(&jvm, (void**) &env, &vm_args);
    delete[] options;

    cls = env->FindClass("jni_test/JniTest0");
    mid = env->GetStaticMethodID(cls, "getReply", "()Ljava/lang/String;");
    jObj = env->CallStaticObjectMethod(cls, mid);
    return 0;
}
My (stripped-down) java method that is invoked in the above JNI code.
public static String getReply() {
    long totalMemory = Runtime.getRuntime().totalMemory();
    System.out.println("total memory = " + totalMemory);
    String reply = "That is the correct answer! Well done!";
    return reply;
}
As far as I understand the JVM memory model:
maxMemory is the maximum amount of memory the JVM may use (as defined by -Xmx)
maxMemory = totalMemory + not-yet-allocated memory
totalMemory - freeMemory = used memory
While it's rarely seen (afaik) that the JVM releases memory back to the OS, it's possible and would explain what you are experiencing.
totalMemory should not drop below the value defined by -Xms, so running your JVM with a fixed heap size might solve your issue.
For example: -Xms64m -Xmx64m
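A quick way to verify those relations at runtime; a minimal sketch (the class name is just for illustration):

public class MemWatch {
    public static void main(String[] args) {
        // max = total + unallocated; used = total - free
        Runtime rt = Runtime.getRuntime();
        System.out.printf("max=%d total=%d free=%d used=%d%n",
                rt.maxMemory(), rt.totalMemory(), rt.freeMemory(),
                rt.totalMemory() - rt.freeMemory());
    }
}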
My suspicion appears to have been proven correct. Sometimes error messages can be misleading: I had a gut feeling from the start that the problem was not really a memory problem, and it appears I was correct.
My code uses java logging to write log files in the Windows temporary folder. It turns out that another, non-related process also writes log files to the same folder. In fact it literally writes thousands of files to that folder and those files are never deleted. Once we deleted those files, the problem went away.
So the problem was the fact that the temporary folder was full to bursting with files - none of which were written by my code. Something to check if anyone experiences a similar problem and winds up here due to an Internet search.
I want to calculate the used memory of a JVM heap. I did the following in a sample application.
Set the JVM heap size with -Xms200m and -Xmx200m.
Did the calculation as follows using the Java Runtime APIs. It gave me the following output for the sample program:
Runtime total memory : 192413696
Runtime max memory : 192413696
Runtime free memory : 39734096
Runtime available memory = (max - total + free) = 39734096
Percentage of used memory = 100 * (max - available) / max = 100 * (192413696 - 39734096) / 192413696 = 79.35%
Did another calculation via JMX: java.lang:type=Memory (using an MBean).
It gave me the following output for the same program:
Used memory : 127737896
Max memory : 201850880
Percentage of used memory = 100 * (used / max) = 100 * (127737896 / 201850880) = 63.28%
Could you please help me with the following?
What is the reason for the difference between the JMX and Java Runtime API values?
If I want to know the memory occupied in my JVM heap, which is the right approach (point 2 or point 3)? My intention is to raise alerts before an OutOfMemoryError occurs in my JVM.
I have another observation as well. When I used the CMS collector (with -Xms and -Xmx set to 32GB and the occupancy fraction set to 70%), I could see a difference between the free memory calculated using the MBeans and the Runtime freeMemory(). When I used G1, I could not find this difference (the MBeans and the Runtime API gave the same value).
Runtime rt = Runtime.getRuntime();
long mem = rt.totalMemory(); // memory currently allocated to the heap
long fm = rt.freeMemory();   // free memory within the allocated heap
long mm = rt.maxMemory();    // upper limit of the heap (-Xmx)
long used = mem - fm;        // used heap: total - free, not max - free
The calculation should be usedMemory / maxMemory * 100:
Used Memory = 127737896
Max Memory = 201850880
127737896 / 201850880 * 100 = 63.28%
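For the JMX side, a minimal sketch reading the same values through the standard MemoryMXBean (the class name is just for illustration):

import java.lang.management.ManagementFactory;
import java.lang.management.MemoryUsage;

public class JmxHeap {
    public static void main(String[] args) {
        // Same data that java.lang:type=Memory exposes over JMX.
        MemoryUsage heap = ManagementFactory.getMemoryMXBean().getHeapMemoryUsage();
        double pct = 100.0 * heap.getUsed() / heap.getMax();
        System.out.printf("used=%d max=%d (%.2f%%)%n",
                heap.getUsed(), heap.getMax(), pct);
    }
}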
I wrote a piece of Java code to create 500K small files (averaging 40KB each) on CentOS. The original code is like this:
package MyTest;

import java.io.*;

public class SimpleWriter {
    public static void main(String[] args) {
        String dir = args[0];
        int fileCount = Integer.parseInt(args[1]);
        String content = "##$% SDBSDGSDF ASGSDFFSAGDHFSDSAWE^#$^HNFSGQW%##&$%^J#%##^$#UHRGSDSDNDFE$T##$UERDFASGWQR!#%!#^$##YEGEQW%!#%!!GSDHWET!^";
        StringBuilder sb = new StringBuilder();
        int count = 40 * 1024 / content.length();
        int remainder = (40 * 1024) % content.length();
        for (int i = 0; i < count; i++) {
            sb.append(content);
        }
        if (remainder > 0) {
            sb.append(content.substring(0, remainder));
        }
        byte[] buf = sb.toString().getBytes();

        for (int j = 0; j < fileCount; j++) {
            String path = String.format("%s%sTestFile_%d.txt", dir, File.separator, j);
            try {
                BufferedOutputStream fs = new BufferedOutputStream(new FileOutputStream(path));
                fs.write(buf);
                fs.close();
            } catch (FileNotFoundException fe) {
                System.out.printf("Hit FileNotFoundException %s", fe.getMessage());
            } catch (IOException ie) {
                System.out.printf("Hit IOException %s", ie.getMessage());
            }
        }
    }
}
You can run this by issuing the following command:
java -jar SimpleWriter.jar my_test_dir 500000
I thought this was simple code, but then I realized that it was using up to 14G of memory. I know that because when I used free -m to check the memory, the free memory kept dropping until my 15G VM had only 70MB of free memory left. I compiled this with Eclipse, against JDK 1.6 and then JDK 1.7; the result was the same. The funny thing is that if I comment out fs.write() and just open and close the stream, the memory stabilizes at a certain point. Once I put fs.write() back, the memory allocation goes wild. 500K 40KB files is about 20G. It seems Java's stream writer never deallocates its buffers during the operation.
I once thought the Java GC did not have time to clean up, but that makes no sense since I closed the file stream for every file. I even ported my code to C#: running under Windows, the same code produced 500K 40KB files with memory stable at a certain point, not taking 14G as under CentOS. At least C#'s behavior is what I expected, but I could not believe Java would perform this way. I asked colleagues who are experienced in Java; they could not see anything wrong in the code, but could not explain why this happened. And they admitted nobody had tried to create 500K files in a loop without stopping.
I also searched online and everybody says that the only thing you need to pay attention to is closing the stream, which I did.
Can anyone help me to figure out what's wrong?
Can anybody also try this and tell me what you see?
BTW, some people in this community tried the code on Windows and it seemed to work fine. I didn't try it on Windows; I only tried it on Linux, since I thought that is where people use Java. So it seems this issue happens on Linux.
I also did the following to limit the JVM heap, but it had no effect:
java -Xmx2048m -jar SimpleWriter.jar my_test_dir 500000
I tried to test your program on Win XP, JDK 1.7.25, and immediately got OutOfMemoryErrors.
While debugging, with only 3000 as the file count (args[1]), the count variable from this code:
int count = 40 * 1024 * 1024 / content.length();
int remainder = (40 * 1024 * 1024) % content.length();
for (int i = 0; i < count; i++) {
    sb.append(content);
}
count is 355449. So the String you are trying to create will be 355449 times the content length or, as you calculated, 40MB long. I ran out of memory when i was 266587 and sb was 31457266 chars long; at that point each file I got was 30MB.
The problem does not seem to be with memory or the GC, but with the way you create the string.
Did you see files being created, or was memory eaten up before any file was created?
I think your main problem is the line:
int count = 40 * 1024 * 1024 / content.length();
should be:
int count = 40 * 1024 / content.length();
to create 40K files, not 40MB ones.
[Edit2: The original answer is left in italics at the end of this post]
After your clarifications in the comments, I have run your code on a Windows machine (Java 1.6) and here are my findings (numbers are from VisualVM, OS memory as seen from Task Manager):
Example with 40K size, writing to 500K files (no parameters to JVM):
Used Heap: ~4M, Total Heap: 16M, OS memory: ~16M
Example with 40M size, writing to 500 files (JVM parameters -Xms128m -Xmx512m; without parameters I get an OutOfMemoryError when creating the StringBuilder):
Used Heap: ~265M, Heap size: ~365M, OS memory: ~365M
Especially from the second example you can see that my original explanation still stands. Yes, one would expect that most of the memory would be freed, since the byte[] arrays of the BufferedOutputStream reside in the young generation (short-lived objects), but this a) does not happen immediately and b) when the GC decides to kick in (it actually does in my case), it will try to clear memory, yet it can clear as much memory as it sees fit, not necessarily all of it. The GC does not provide any guarantees that you can count upon.
So generally speaking, you should give the JVM as much memory as you feel comfortable with. If you need to keep memory low for special functionality, you should try a strategy like the code example I give below in my original answer, i.e. just don't create all those byte[] objects.
Now in your case with CentOS, it does seem that the JVM behaves strangely. Perhaps we could talk about a buggy or bad implementation. To classify it as a leak/bug, though, you should try to use -Xmx to restrict the heap. Also, please try what Peter Lawrey suggested: do not create the BufferedOutputStream at all (in the small-file case), since you just write all the bytes at once, as in the sketch below.
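A minimal sketch of that suggestion, reusing the path and buf variables from the question's code (plain try/finally so it also compiles on JDK 1.6):

// Write the whole 40K payload in one call; no BufferedOutputStream
// (and thus no extra 8K buffer per file) is needed.
FileOutputStream fs = new FileOutputStream(path);
try {
    fs.write(buf);
} finally {
    fs.close();
}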
If it still exceeds the memory limit, then you have encountered a leak and should probably file a bug. (You could still complain, though, and they may optimize it in the future.)
[Edit1: The answer below assumed that the OP's code performed as many read operations as write operations, so the memory usage was justifiable. The OP clarified this is not the case, so his question is not answered.
"...my 15G memory VM..."
If you give the JVM that much memory, why should it try to run the GC? As far as the JVM is concerned, it is allowed to take as much memory from the system as it likes and run the GC only when it thinks it appropriate to do so.
Each BufferedOutputStream will allocate a buffer of 8K by default. The JVM will try to reclaim that memory only when it needs to. This is the expected behaviour.
Do not confuse the memory that you see as free from the system's point of view with the memory that is free from the JVM's point of view. As far as the system is concerned, the memory is allocated and will be released when the JVM shuts down. As far as the JVM is concerned, all the byte[] arrays allocated by BufferedOutputStream are no longer in use; they are "free" memory and will be reclaimed if needed.
If for some reason you don't desire this behaviour, you could try the following: extend the BufferedOutputStream class (e.g. create a ReusableBufferedOutputStream class) and add a new method, e.g. reUseWithStream(OutputStream os). This method would then clear the internal byte[], flush and close the previous stream, reset any variables used, and set the new stream. Your code would then become as below:
// initialize once
ReusableBufferedOutputStream fs = new ReusableBufferedOutputStream();
for (int i = 0; i < fileCount; i++) {
    String path = String.format("%s%sTestFile_%d.txt", dir, File.separator, i);
    // point the reusable buffer at the new stream
    fs.reUseWithStream(new FileOutputStream(path));
    fs.write(buf, 0, buf.length); // buf was allocated once, 40K of text
}
fs.close(); // close the last stream after we are done
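A minimal sketch of what such a class could look like (hypothetical; it relies on the protected count field of BufferedOutputStream and the protected out field of FilterOutputStream):

import java.io.BufferedOutputStream;
import java.io.IOException;
import java.io.OutputStream;

class ReusableBufferedOutputStream extends BufferedOutputStream {

    public ReusableBufferedOutputStream() {
        // Start with a do-nothing sink until the first reUseWithStream() call.
        super(new OutputStream() {
            @Override public void write(int b) { /* discard */ }
        });
    }

    // Flush and close the previous stream, then point the single,
    // reused internal buffer at the new one.
    public void reUseWithStream(OutputStream os) throws IOException {
        flush();     // push any buffered bytes to the old stream
        out.close(); // close the previous FileOutputStream
        count = 0;   // reset the buffer position
        out = os;
    }
}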
Using the above approach you will avoid creating many byte[] arrays. However, I don't see any problem with the expected behaviour, nor do you mention any problem other than "I see it takes too much memory". You have configured it to use that memory, after all.]
I'm writing a program in Java which has to make use of a large hash table; the bigger the hash table can be, the better (it's a chess program :P). Basically, as part of my hash table I have a long[] array, a short[] array, and two byte[] arrays, all of the same size. When I set my table size to ten million, however, it crashes and says "java heap out of memory". This makes no sense to me. Here's how I see it:
1 long + 1 short + 2 bytes = 12 bytes
× 10,000,000 = 120,000,000 bytes
÷ 1024 = 117,187.5 KB
÷ 1024 = 114.4 MB
Now, 114 MB of RAM doesn't seem like too much to me. In total my Mac has 4GB of RAM, and an app called FreeMemory shows around 2GB free while running this program. Also, I set the Java preferences to -Xmx1024m, so Java should be able to use up to a gig of memory. So why won't it let me allocate just 114MB?
You predicted that it should use 114 MB, and if I run this (on a Windows box with 4 GB):
public static void main(String... args) {
    long used1 = memoryUsed();

    int HASH_TABLE_SIZE = 10000000;
    // the arrays are deliberately unused: we only measure their allocation
    long[] pos = new long[HASH_TABLE_SIZE];
    short[] vals = new short[HASH_TABLE_SIZE];
    byte[] depths = new byte[HASH_TABLE_SIZE];
    byte[] flags = new byte[HASH_TABLE_SIZE];

    long used2 = memoryUsed() - used1;
    System.out.printf("%,d MB used%n", used2 / 1024 / 1024);
}

private static long memoryUsed() {
    return Runtime.getRuntime().totalMemory() - Runtime.getRuntime().freeMemory();
}
prints
114 MB used
I suspect you are doing something else which is the cause of your problem.
I am using Oracle HotSpot Java 7 update 10
You have not taken into account that each array is itself an object with its own header, which also uses memory, plus other "hidden things"; we must also take alignment into account... a byte is not always just a byte ;-)
Java Objects Memory Structure
How much memory is used by Java
To see how much memory is really in use, you can use a profiler:
visualvm
If you are using a standard HashMap (or similar from the JDK), each long is boxed to a Long and really takes more than 8 bytes. You could use a primitive-keyed map like this as a base (it uses less memory):
NativeIntHashMap
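A rough sketch of the boxing overhead (the numbers are approximate, since GC timing makes Runtime-based measurements noisy; the class name is just for illustration):

public class BoxedCost {
    static long used() {
        Runtime rt = Runtime.getRuntime();
        return rt.totalMemory() - rt.freeMemory();
    }

    public static void main(String[] args) {
        int n = 1_000_000;
        long before = used();
        long[] primitives = new long[n]; // roughly 8 MB of payload
        System.out.printf("long[] : ~%d MB%n", (used() - before) >> 20);

        before = used();
        Long[] boxed = new Long[n];
        for (int i = 0; i < n; i++) {
            boxed[i] = (long) (i + 128); // values above 127 avoid the Long cache
        }
        System.out.printf("Long[] : ~%d MB%n", (used() - before) >> 20);
    }
}

The boxed version pays for the reference array plus a separate object (header included) per value, typically several times the primitive footprint.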
From what I have read about BlueJ (serious technical information is almost impossible to find), the BlueJ VM quite likely does not support primitive types at all; your arrays are actually arrays of boxed primitives. BlueJ uses a subset of all Java features, with emphasis on object orientation.
If that is the case, then taking into consideration that performance and efficiency are quite low on the BlueJ VM's list of priorities, you may actually be using quite a bit more memory than you think: a whole order of magnitude is quite imaginable.
I believe one approach would be to clean up the heap memory after each execution; one link is here:
Java heap space out of memory
Is there any way to get the size of the total memory on the operating system from Java? Using
Runtime.getRuntime().maxMemory()
returns the memory allowed for the JVM, not that of the operating system. Does anyone have a way to obtain this (from Java code)?
com.sun.management.OperatingSystemMXBean bean =
(com.sun.management.OperatingSystemMXBean)
java.lang.management.ManagementFactory.getOperatingSystemMXBean();
long max = bean.getTotalPhysicalMemorySize();
This returns the physical RAM size available to the JVM (limited on 32-bit systems), not the heap size.
There is no Java-only way to get that information. You may use Runtime.exec() to start OS-specific commands, e.g. /usr/bin/free on Linux. Still on Linux systems, you can use the Java file access classes (e.g. FileInputStream) to parse /proc/meminfo, as in the sketch below.
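A minimal sketch of the /proc/meminfo approach (Linux only; the class name is just for illustration):

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;

public class MemInfo {
    public static void main(String[] args) throws IOException {
        // MemTotal is the first line of /proc/meminfo on Linux.
        for (String line : Files.readAllLines(Paths.get("/proc/meminfo"))) {
            if (line.startsWith("MemTotal:")) {
                System.out.println(line.trim()); // e.g. "MemTotal: 16337152 kB"
                break;
            }
        }
    }
}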
That is not possible with pure Java; your program runs on the Java virtual machine and is therefore isolated from the OS. I suggest two solutions for this:
1) You can use JNI and call a C++ function to do that.
2) Another option is to use Runtime.exec(). Then you have to get the info from "cat /proc/meminfo".
You can get the RAM usage with this; it is the same value that Task Manager on Windows shows:
com.sun.management.OperatingSystemMXBean bean =
        (com.sun.management.OperatingSystemMXBean)
        java.lang.management.ManagementFactory.getOperatingSystemMXBean();
// getFreePhysicalMemorySize() pairs with getTotalPhysicalMemorySize();
// getFreeMemorySize() only exists on JDK 14+
double percentage = ((double) bean.getFreePhysicalMemorySize()
        / (double) bean.getTotalPhysicalMemorySize()) * 100;
percentage = 100 - percentage;
System.out.println("RAM Usage: " + percentage + "%");