Is there any way to get the total memory size of the operating system from Java? Using
Runtime.getRuntime().maxMemory()
returns the memory allowed for the JVM, not the memory of the operating system. Does anyone have a way to obtain this (from Java code)?
com.sun.management.OperatingSystemMXBean bean =
        (com.sun.management.OperatingSystemMXBean)
                java.lang.management.ManagementFactory.getOperatingSystemMXBean();
long max = bean.getTotalPhysicalMemorySize();
This returns the total physical RAM visible to the JVM (a 32-bit JVM may report a capped value), not the heap size.
There is no Java-only way to get that information. You can use Runtime.exec() to start OS-specific commands, e.g. /usr/bin/free on Linux. Also on Linux, you can use Java file-access classes (such as FileInputStream) to parse /proc/meminfo.
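On Linux, parsing /proc/meminfo might look like the following sketch (the parseKb helper is illustrative, and the file format is Linux-specific):

```java
import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;

public class MemInfo {
    // Extract the kB value from a /proc/meminfo line such as
    // "MemTotal:       16384256 kB"
    static long parseKb(String line) {
        return Long.parseLong(line.replaceAll("\\D+", ""));
    }

    public static void main(String[] args) throws IOException {
        try (BufferedReader r = new BufferedReader(new FileReader("/proc/meminfo"))) {
            String line;
            while ((line = r.readLine()) != null) {
                if (line.startsWith("MemTotal:")) {
                    System.out.println("Total RAM: " + (parseKb(line) / 1024) + " MB");
                    break;
                }
            }
        }
    }
}
```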
That is not possible with pure Java: your program runs on the Java virtual machine and is therefore isolated from the OS. I suggest two solutions:
1) Use JNI and call a C++ function that queries the OS.
2) Use Runtime.exec() and read the info from "cat /proc/meminfo".
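A sketch of option 2, assuming a Linux system (the firstLineStartingWith helper is illustrative):

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;

public class ExecMemInfo {
    // Run an external command and return the first stdout line starting
    // with the given prefix (or null if none matches).
    static String firstLineStartingWith(String[] cmd, String prefix) throws IOException {
        Process p = Runtime.getRuntime().exec(cmd);
        try (BufferedReader out = new BufferedReader(
                new InputStreamReader(p.getInputStream()))) {
            String line;
            while ((line = out.readLine()) != null) {
                if (line.startsWith(prefix)) {
                    return line;
                }
            }
        }
        return null;
    }

    public static void main(String[] args) throws IOException {
        // Linux-only: reads the MemTotal line from /proc/meminfo
        System.out.println(firstLineStartingWith(
                new String[] {"cat", "/proc/meminfo"}, "MemTotal:"));
    }
}
```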
You can get the RAM usage with this. It is the same value that Task Manager shows on Windows:
com.sun.management.OperatingSystemMXBean bean =
        (com.sun.management.OperatingSystemMXBean)
                java.lang.management.ManagementFactory.getOperatingSystemMXBean();
double percentage = ((double) bean.getFreePhysicalMemorySize()
        / (double) bean.getTotalPhysicalMemorySize()) * 100;
percentage = 100 - percentage;
System.out.println("RAM Usage: " + percentage + "%");
Related
When trying to benchmark a specific method, with regard to how many objects are created and how many bytes they occupy while that method is running, in Android it is possible to do this:
Debug.resetThreadAllocCount()
Debug.resetThreadAllocSize()
Debug.startAllocCounting()
benchmarkMethod()
Debug.stopAllocCounting()
var memoryAllocCount = Debug.getThreadAllocCount()
var memoryAllocSize = Debug.getThreadAllocSize()
I would now like to benchmark the same method in a normal desktop application, where these methods are not available. I have not found anything similar, and the other memory-benchmarking code I have tried does not produce consistent results, whereas the code above gives exactly the same result every time the same benchmark runs.
Any suggestion, preferably code, would be appreciated; however, I would also be open to trying some software if it can perform this task.
ThreadMXBean.getThreadAllocatedBytes can help:
import java.lang.management.ManagementFactory;

com.sun.management.ThreadMXBean bean =
        (com.sun.management.ThreadMXBean) ManagementFactory.getThreadMXBean();

long currentThreadId = Thread.currentThread().getId();
long before = bean.getThreadAllocatedBytes(currentThreadId);
allocatingMethod();
long after = bean.getThreadAllocatedBytes(currentThreadId);
System.out.println("Allocated " + (after - before) + " bytes");
The method returns an approximation of the total allocated memory, but this approximation is usually quite precise.
Also, async-profiler has a Java API for profiling allocations. It not only counts how many objects are allocated, but also shows the exact allocated objects with the stack traces of the allocation sites.
public static void main(String[] args) throws Exception {
AsyncProfiler profiler = AsyncProfiler.getInstance();
// Dry run to skip allocations caused by AsyncProfiler initialization
profiler.start("alloc", 0);
profiler.stop();
// Real profiling session
profiler.start("alloc", 0);
allocatingMethod();
profiler.stop();
profiler.execute("file=alloc.svg"); // save the output to alloc.svg
}
How to run:
java -Djava.library.path=/path/to/async-profiler -XX:+UseG1GC -XX:-UseTLAB Main
The -XX:+UseG1GC -XX:-UseTLAB options are needed to record all allocations. Otherwise, async-profiler works in sampling mode, recording only a small portion of allocations.
The output is an allocation flame graph showing each allocation site with its stack trace.
I want to calculate the used memory of a JVM heap. I did the following in a sample application.
1. Set the JVM heap size with -Xms200m and -Xmx200m.
2. Did the calculation as follows using the Java Runtime API. It gave me the following output for the sample program.
Runtime total memory : 192413696
Runtime max memory : 192413696
Runtime free memory : 39734096
Runtime available memory = (max - total + free) = 39734096
Percentage of used memory = 100 * (max - available) / max = 100 * (192413696 - 39734096) / 192413696 = 79.35%
3. Did another calculation via JMX: java.lang:type=Memory (using an MBean). It gave me the following output for the same program.
Used memory : 127737896
Max memory : 201850880
Percentage of used memory = 100 * used / max = 100 * 127737896 / 201850880 = 63.28%
Could you please help me with the following ?
What is the reason for the difference between the JMX and Java Runtime API values?
If I want to know the memory occupied in my JVM heap, which is the right approach (point 2 or point 3)? My intention is to raise alerts before an OutOfMemoryError occurs in my JVM.
I have another observation as well. When I use the CMS algorithm (with -Xms and -Xmx set to 32GB and the occupancy fraction set to 70%), I can see a difference between the free memory calculated via MBeans and via the Runtime freeMemory(). When I was using G1, I could not see this difference (the MBeans and the Runtime API gave the same value).
Runtime rt = Runtime.getRuntime();
long mem = rt.totalMemory();  // currently committed heap
long fm = rt.freeMemory();    // free space within the committed heap
long mm = rt.maxMemory();     // the -Xmx limit
long used = mem - fm;         // used heap = committed - free
The calculation should be usedMemory / maxMemory * 100:
Used memory = 127737896
Max memory = 201850880
127737896 / 201850880 * 100 = 63.28%
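For the alerting goal mentioned in the question, the JMX heap numbers can be read directly via MemoryMXBean. A minimal sketch (the 90% threshold is a hypothetical value chosen for illustration):

```java
import java.lang.management.ManagementFactory;
import java.lang.management.MemoryUsage;

public class HeapAlert {
    // Hypothetical alert threshold, for illustration only
    static final double ALERT_THRESHOLD = 90.0;

    static double usedPercent(long used, long max) {
        return 100.0 * used / max;
    }

    public static void main(String[] args) {
        // Same source as the java.lang:type=Memory MBean
        MemoryUsage heap = ManagementFactory.getMemoryMXBean().getHeapMemoryUsage();
        double pct = usedPercent(heap.getUsed(), heap.getMax());
        System.out.printf("Heap used: %.2f%%%n", pct);
        if (pct > ALERT_THRESHOLD) {
            System.out.println("ALERT: heap usage above threshold");
        }
    }
}
```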
I'm writing a program in Java which has to make use of a large hash table; the bigger the hash table can be, the better (it's a chess program :P). As part of my hash table I have a long[] array, a short[] array, and two byte[] arrays, all of the same size. When I set my table size to ten million, however, it crashes with "java heap out of memory". This makes no sense to me. Here's how I see it:
1 long (8 bytes) + 1 short (2 bytes) + 2 bytes = 12 bytes
x 10,000,000 = 120,000,000 bytes
/ 1024 = 117,187.5 KB
/ 1024 = 114.4 MB
Now, 114 MB of RAM doesn't seem like too much to me. My Mac has 4 GB of RAM in total, and an app called FreeMemory shows around 2 GB free while this program runs. Also, I set the Java preferences to -Xmx1024m, so Java should be able to use up to a gigabyte of memory. So why won't it let me allocate just 114 MB?
You predicted that it should use 114 MB, and if I run this (on a Windows box with 4 GB)
public static void main(String... args) {
    long used1 = memoryUsed();
    int HASH_TABLE_SIZE = 10000000;
    long[] pos = new long[HASH_TABLE_SIZE];
    short[] vals = new short[HASH_TABLE_SIZE];
    byte[] depths = new byte[HASH_TABLE_SIZE];
    byte[] flags = new byte[HASH_TABLE_SIZE];
    long used2 = memoryUsed() - used1;
    System.out.printf("%,d MB used%n", used2 / 1024 / 1024);
}

private static long memoryUsed() {
    return Runtime.getRuntime().totalMemory() - Runtime.getRuntime().freeMemory();
}
prints
114 MB used
I suspect you are doing something else that is the cause of your problem. I am using Oracle HotSpot Java 7 update 10.
You have not taken into account that each array is itself an object whose reference and header also use memory, along with other "hidden things"... you must also take alignment into account... a byte is not always a byte ;-)
Java Objects Memory Structure
How much memory is used by Java
To see how much memory is really in use, you can use a profiler:
visualvm
If you are using the standard HashMap (or similar from the JDK), each long is boxed/unboxed and really takes more than 8 bytes. You can use this as a base instead (it uses less memory):
NativeIntHashMap
From what I have read about BlueJ (serious technical information is almost impossible to find), the BlueJ VM quite likely does not support primitive types at all; your arrays are actually arrays of boxed primitives. BlueJ uses a subset of all Java features, with an emphasis on object orientation.
If that is the case, plus taking into consideration that performance and efficiency are quite low on BlueJ VM's list of priorities, you may actually be using quite a bit more memory than you think: a whole order of magnitude is quite imaginable.
I believe one way would be to clean the heap memory after each execution; one link is here:
Java heap space out of memory
I have a Java application running on Linux with Xmx set to 1200M
When I check the process in top, I see numbers like:
VIRT = 1412m
RES = 237m
SHR = 58m
Inside the Java application, I am printing the Runtime.getRuntime().totalMemory() every minute and the number there shows:
totalMemory() = 108M
Why such a big difference between the values of RES and totalMemory()?
My understanding (which could be wrong) -- totalMemory() is the memory used right now in the Heap. RES -- actual RAM memory used by the Process.
As an update:
Yes, I would expect RES > totalMemory(), but the difference here is 237 MB - 108 MB = 129 MB. So if someone asks me what the maximum memory my Java application can use is, it should be 1200 MB + "something" - the question is how to know that "something"... is it 150 MB? 200 MB? 300 MB?
RES will probably include the size of the shared libraries loaded by the JVM, many of which might also be loaded for other applications (such as the C runtime library) and as such don't really count against the actual memory usage of the JVM, but are considered part of its resident memory image.
Actual interpretation of the results of ps or top are probably better directed to superuser.org or serverfault.org.
totalMemory() is just the heap size; the Java process contains the Java heap and other things, for example the permanent generation, so the RES value is always bigger than the Java heap.
I run into the following error when I try to store a large file in a string.
Exception in thread "main" java.lang.OutOfMemoryError: Java heap space
at java.util.Arrays.copyOf(Arrays.java:2882)
at java.lang.AbstractStringBuilder.expandCapacity(AbstractStringBuilder.java:100)
at java.lang.AbstractStringBuilder.append(AbstractStringBuilder.java:515)
at java.lang.StringBuffer.append(StringBuffer.java:306)
at rdr2str.ReaderToString.main(ReaderToString.java:52)
As is evident, I am running out of heap space. Basically my program looks something like this:
FileReader fr = new FileReader(<filepath>);
StringBuffer sb = new StringBuffer();
char[] b = new char[BLKSIZ];
int n;
while ((n = fr.read(b)) > 0)
    sb.append(b, 0, n);
String fileString = sb.toString();
Can someone tell me why I am running into a heap space error? Thanks.
You are running out of memory because, the way you've written your program, it requires storing the entire, arbitrarily large file in memory. You have two options:
You can increase the memory by passing command line switches to the JVM:
java -Xms<initial heap size> -Xmx<maximum heap size>
You can rewrite your logic so that it deals with the file data as it streams in, thereby keeping your program's memory footprint low.
I recommend the second option. It's more work but it's the right way to go.
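A sketch of that streaming approach, assuming line-oriented processing (the countMatches helper and the "ERROR" search term are illustrative stand-ins for real per-line logic):

```java
import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;
import java.io.Reader;

public class StreamProcess {
    // Count lines containing a term without ever holding the whole file
    // in memory; only one line is buffered at a time.
    static long countMatches(Reader source, String term) throws IOException {
        long matches = 0;
        try (BufferedReader in = new BufferedReader(source)) {
            String line;
            while ((line = in.readLine()) != null) {
                if (line.contains(term)) {
                    matches++;
                }
            }
        }
        return matches;
    }

    public static void main(String[] args) throws IOException {
        System.out.println("Matches: "
                + countMatches(new FileReader(args[0]), "ERROR"));
    }
}
```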
EDIT: To determine your system's defaults for initial and max heap size, you can use this code snippet (which I stole from a JavaRanch thread):
public class HeapSize {
    public static void main(String[] args) {
        long kb = 1024;
        long heapSize = Runtime.getRuntime().totalMemory();
        long maxHeapSize = Runtime.getRuntime().maxMemory();
        System.out.println("Heap Size (KB): " + heapSize / kb);
        System.out.println("Max Heap Size (KB): " + maxHeapSize / kb);
    }
}
You allocate a small StringBuffer that gets longer and longer. Preallocate it according to the file size, and it will also be a LOT faster.
Note that Java strings are Unicode (two bytes per char), while the file likely is not, so the data takes twice the size in memory.
Depending on the VM (32-bit? 64-bit?) and the limits set (http://www.devx.com/tips/Tip/14688), you may simply not have enough memory available. How large is the file, actually?
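The preallocation suggested above could be sketched like this (the readAll helper and the 8192-char buffer size are illustrative choices):

```java
import java.io.File;
import java.io.FileReader;
import java.io.IOException;
import java.io.Reader;

public class PreallocRead {
    // Read everything from a Reader into a StringBuilder preallocated to
    // sizeHint chars, avoiding the repeated capacity-doubling copies.
    static String readAll(Reader in, int sizeHint) throws IOException {
        StringBuilder sb = new StringBuilder(sizeHint);
        char[] buf = new char[8192];
        int n;
        while ((n = in.read(buf)) > 0) {
            sb.append(buf, 0, n);
        }
        return sb.toString();
    }

    public static void main(String[] args) throws IOException {
        File f = new File(args[0]);
        // The byte length is an upper bound on the char count for
        // single-byte encodings, so the buffer never needs to grow.
        try (FileReader fr = new FileReader(f)) {
            String fileString = readAll(fr, (int) f.length());
            System.out.println("Read " + fileString.length() + " chars");
        }
    }
}
```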
In the OP, your program is aborting while the StringBuffer is being expanded. You should preallocate it to the size you need, or at least close to it. When a StringBuffer must expand, it needs RAM for both the original capacity and the new capacity. As TomTom said, your file likely holds 8-bit characters, which are converted to 16-bit Unicode in memory, doubling in size.
The program has not even encountered the next doubling yet: StringBuffer.toString() in Java 6 allocates a new String, and the internal char[] is copied again (in some earlier versions of Java this was not the case). At the time of that copy you need double the heap space, so at that moment at least 4 times your actual file size (30 MB * 2 for byte->Unicode, then 60 MB * 2 for the toString() call = 120 MB). Once this method finishes, GC will clean up the temporary objects.
If you cannot increase the heap space for your program you will have some difficulty. You cannot take the "easy" route and just return a String. You can try to do this incrementally so that you do not need to worry about the file size (one of the best solutions).
Look at your web service code in the client. It may provide a way to use a different class other than String - perhaps a java.io.Reader, java.lang.CharSequence, or a special interface like the SAX-related org.xml.sax.InputSource. Each of these can be used to build an implementation class that reads from your file in chunks as the caller needs it, instead of loading the whole file at once.
For instance, if your web service handling routes can take a CharSequence then (if they are written well) you can create a special handler to return just one character at a time from the file - but buffer the input. See this similar question: How to deal with big strings and limited memory.
Kris has the answer to your problem.
You could also look at Apache Commons IO's FileUtils.readFileToString, which may be a bit more efficient.
Although this might not solve your problem, some small things you can do to make your code a bit better:
create your StringBuffer with an initial capacity the size of the String you are reading
close your filereader at the end: fr.close();
By default, Java starts with a very small maximum heap (64M on Windows at least). Is it possible you are trying to read a file that is too large?
If so you can increase the heap with the JVM parameter -Xmx256M (to set maximum heap to 256 MB)
I tried running a slightly modified version of your code:
public static void main(String[] args) throws Exception {
    FileReader fr = new FileReader("<filepath>");
    StringBuffer sb = new StringBuffer();
    char[] b = new char[1000];
    int n;
    while ((n = fr.read(b)) > 0)
        sb.append(b, 0, n);
    fr.close();
    String fileString = sb.toString();
    System.out.println(fileString);
}
on a small file (2 KB) and it worked as expected. You will need to set the JVM parameter.
Trying to read an arbitrarily large file into main memory in an application is bad design. Period. No amount of JVM settings adjustments is going to fix the core issue here. I recommend that you take a break and do some googling and reading about how to process streams in Java - here's a good tutorial and here's another good tutorial to get you started.
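As a concrete starting point for stream processing, a lazy line stream keeps only one line in memory at a time (a minimal sketch; countLines is an illustrative helper):

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.stream.Stream;

public class LineStream {
    // Files.lines reads lazily, so the whole file is never held in memory.
    static long countLines(Path file) throws IOException {
        try (Stream<String> lines = Files.lines(file)) {
            return lines.count();
        }
    }

    public static void main(String[] args) throws IOException {
        System.out.println("Lines: " + countLines(Paths.get(args[0])));
    }
}
```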