Java app that uses a lot of memory. Use -Xmx? - java

I have a Java app that uses about 15G on a machine with 16G. I don't know whether I should set the max heap size.
If I set it, will the JVM eat all the RAM up to the limit and then start garbage collecting, stopping everything while it churns through 15G of heap objects?
If I don't set it, will the JVM hurt performance by not using all of the available RAM on the machine?
My specific vm is: Java HotSpot(TM) 64-Bit Server VM (build 1.6.0_03-b05, mixed mode).
Thanks

-Xmx15G will set the maximum heap size to 15 gig. Java will only allocate what it needs as it runs. If you don't set it, it will only use the default. For info on the default, see this post.
-Xms15G sets the minimum heap to 15 gig. This forces java to allocate 15 gig of heap space before it starts executing, whether it needs it or not.
Usually you can set them both to appropriate values depending on how you're tuning the JVM.
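For example, for the application described above, a launch line along these lines (the jar name is just a placeholder) caps the heap at 15 gig and also commits it all up front, which avoids the cost of growing the heap incrementally while the app fills it:
java -Xms15G -Xmx15G -jar myapp.jar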

In Java 6, the default maximum heap size is determined by the amount of system memory present.
According to the Garbage Collector Ergonomics page, the maximum heap size is:
Smaller of 1/4th of the physical memory or 1GB. Before J2SE 5.0, the default maximum heap size was 64MB.
The -Xmx switch can be used to change the maximum heap size. See the java - the Java application launcher documentation for usage details.
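If you want to confirm what maximum your JVM actually picked, a quick check from inside the application is Runtime.maxMemory(), which reports the heap limit in bytes (a minimal sketch; the class name is arbitrary):
public class MaxHeapCheck {
    public static void main(String[] args) {
        // maxMemory() returns the maximum amount of memory the JVM will attempt to use, in bytes
        long maxBytes = Runtime.getRuntime().maxMemory();
        System.out.println("Max heap: " + (maxBytes / (1024 * 1024)) + " MB");
    }
}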

If you don't set a max heap size (with -Xmx), isn't the default maximum only 64MB?
So won't your application fail with OutOfMemoryErrors if you don't set it? I'm confused by this. How can your application run without this switch?


Why default java max heap is 1/4th of Physical memory?

I have read a couple of articles on Java heap space and found out that the default max heap for the JVM is 1/4th of the actual physical memory. But none of the articles gave a reason for this.
What's the reason for having it as 1/4th of the actual memory?
https://docs.oracle.com/javase/8/docs/technotes/guides/vm/gc-ergonomics.html
This dates back to JDK 5, which introduced JVM ergonomics. Prior to this, the JVM used very small default heap sizes: JDK 1.1 defaulted both Xms and Xmx to 16Mb, JDK 1.2 changed this to an Xms of 1Mb and an Xmx of 64Mb, and in JDK 1.3 the Xms default increased to 2Mb.
Since Java was proving more popular on servers and memory capacities were increasing significantly, Sun introduced the concept of a server-class machine in JDK 5. This is one that has 2 or more physical processors and 2 or more Gb of memory (if I remember rightly, in JDK 5, the machine also had to not be running Windows to count as a server).
On server-class machines, the following parameters were set by default:
Throughput garbage collector (i.e. the parallel collector)
initial heap size of 1/64 of physical memory up to 1Gbyte
maximum heap size of 1/4 of physical memory up to 1Gbyte
Server runtime compiler
Ergonomics provided two command-line flags that allowed a user to set a performance goal for the JVM; the idea being that the JVM would then figure out internally how to achieve this goal by modifying its parameters. The ultimate goal was to eliminate a lot of the -XX flags that were being used to tune JVM performance manually.
The parameters are:
-XX:MaxGCPauseMillis=nnn which sets the maximum pause time you want for GC in milliseconds.
-XX:GCTimeRatio=nnn which sets the ratio of garbage collection time to application time to 1 / (1 + nnn). This was referred to as the throughput goal.
You can specify either of these goals or both. If the JVM manages to achieve both of these goals it then attempts to reduce the memory being used (the footprint goal).
There's more detail here:
https://www.oracle.com/technetwork/java/ergo5-140223.html
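As a rough illustration, a command line using those ergonomics goals might look like this (the values and jar name are arbitrary examples):
java -XX:MaxGCPauseMillis=200 -XX:GCTimeRatio=19 -Xmx4g -jar myapp.jar
Here GCTimeRatio=19 asks the JVM to spend at most 1/(1+19) = 5% of total time in GC, and MaxGCPauseMillis=200 asks for pauses no longer than 200 ms; the collector treats both as goals, not guarantees.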

Finding Memory Requirements of Java Application

We have a desktop Java Swing application. For shipping it, we need to specify the minimum memory requirements for deploying this application. In the JVM parameters we specify 2GB as max heap size.
Is there any tool for a Windows based machine which can quantify the requirements?
Also, as a follow-up question, I would like to know: if we do not specify the max heap size in Java 7, does the JVM still automatically adjust the heap size on the fly before throwing an OutOfMemoryError?
Possible approach:
If you specify that your product should work with at most 2GB of heap, you also have to consider the other parts of memory allocated within the Java virtual machine, such as the permanent generation/metaspace, thread stacks, the code cache, and other native allocations.
To find out your memory consumption, I suggest you test your application with MemoryMXBean. This includes methods such as getHeapMemoryUsage() and getNonHeapMemoryUsage().
Then stress-test your application and periodically check these properties. This way you should get a feeling for how much memory your application consumes.
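A minimal sketch of that kind of periodic check (run inside or alongside the application under test; the class name and sample interval are arbitrary):
import java.lang.management.ManagementFactory;
import java.lang.management.MemoryMXBean;
import java.lang.management.MemoryUsage;

public class MemoryMonitor {
    public static void main(String[] args) throws InterruptedException {
        MemoryMXBean memoryBean = ManagementFactory.getMemoryMXBean();
        while (true) {
            MemoryUsage heap = memoryBean.getHeapMemoryUsage();
            MemoryUsage nonHeap = memoryBean.getNonHeapMemoryUsage();
            // getUsed() reports bytes currently used in each area
            System.out.printf("heap used = %d MB, non-heap used = %d MB%n",
                    heap.getUsed() / (1024 * 1024),
                    nonHeap.getUsed() / (1024 * 1024));
            Thread.sleep(5000); // sample every 5 seconds while the stress test runs
        }
    }
}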
In addition to that, Microsoft specifies 2GB as the minimum RAM for Windows 10.
So, your final minimum requirement would be: Minimum = MaximumHeap (2GB) + StressTestNonHeap (?) + WindowsMinimum (2GB) + SomeSafetyMargin (~1GB).
Further approaches:
You could also use VisualVM to check your memory consumption.
Another possibility is to use Java HotSpot Native Memory Tracking (NMT), for which I posted an example on Stack Overflow.
Anything that also informs you about non-heap memory usage is applicable.
Max heap limits:
Regarding your question
Also, on another note, I just wanted to know: if we do not specify the max heap limits with Java 7, does the JVM automatically allocate heap on the fly to adjust before throwing an out-of-memory error?
If you do not specify the max heap size, the JVM will set it automatically depending on the used GC (in Java 7 this should be UseParallelOldGC) and your system. To test this, run java -XX:+PrintVMOptions -XX:+AggressiveOpts -XX:+UnlockDiagnosticVMOptions -XX:+UnlockExperimentalVMOptions -XX:+PrintFlagsFinal -version and check what values are set for MaxHeapSize and UseParallelOldGC.
GC considerations:
Also: You probably want to consider using the Garbage-First (G1) GC, which will be the default GC in Java 9. In this question I show that the G1 GC also re-shrinks the heap if it thinks that is practical. This may be useful if your application has memory-intensive and non-memory-intensive parts. This way, the heap may shrink during the non-memory-intensive parts, which most probably won't happen with the ParallelOldGC.
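Switching collectors is just another command-line flag; a hypothetical invocation (the heap size and jar name are placeholders) could be:
java -XX:+UseG1GC -Xmx2g -jar yourapp.jar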
When you run the server JVM without specifying a maximum heap size, it uses 1/4 of main memory, up to 32 GB. If you use the 32-bit Windows client VM, it uses 64MB or 128MB.
The best way to determine the required memory consumption is to test your application with different memory sizes. The minimum memory is the lowest memory size you are willing to support. Only you know what you are comfortable supporting.

will java use more memory when running on machine with larger ram

If I have a smaller-RAM machine and a larger-RAM machine and I run the same Java code on them,
will the JVM do garbage collection more lazily on the machine with larger RAM?
The problem I am trying to solve is an out-of-memory issue. People reported that they hit an out-of-memory issue on a small-RAM machine. I want to test that, but the only machine I have now has much more RAM than theirs. I am wondering: if I do the test on this larger-RAM machine and keep track of the memory usage, will the memory usage be the same as on a smaller-RAM machine, or will it use even less memory?
Thanks!
Erben
You need to take a look at the JVM memory parameters. You can actually assign as much memory as you want to your JVM:
-Xmx2048m -> sets the max heap memory that the JVM can allocate
-Xms1024m -> the initial heap memory that the JVM will allocate on startup
-XX:MaxPermSize=512M -> sets the max Permanent Generation memory
So in your case you can set the same memory limits as on the other machine, and your machine will not take more heap RAM than the Xmx value.
You may also want to check these parameters:
-XX:MaxNewSize= -> this needs to be about 40% of your Xmx value
-XX:NewSize=614m -> this needs to be about 40% of your Xmx value
You can also tell your JVM what type of GC to use, for example:
-XX:+UseConcMarkSweepGC
So if you set these parameters on both machines, you will most likely get the same results and the same GC activity.
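Put together, a launch line matching the other machine's settings might look like this (the sizes and jar name are illustrative, and -XX:MaxPermSize only applies up to Java 7, since the permanent generation was removed in Java 8):
java -Xms1024m -Xmx2048m -XX:MaxPermSize=512M -XX:+UseConcMarkSweepGC -jar yourapp.jar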
Yes it will. This depends on the default maximum heap size. You can check your current maximum heap size using this command:
java -XshowSettings:vm
On my wife's laptop (Windows 8.1, 4 GB RAM, 32-bit Java runtime) it is 247.5 MB, while on my laptop (Windows 7, 8 GB RAM, 64-bit Java runtime) it is 903.12 MB.
This is determined by Java (see https://stackoverflow.com/a/4667635/3236102; note the values shown there are for server-class machines and may differ on normal machines).
If you want your VM to simulate a low-RAM machine, just use the -Xmx flag to limit the heap (e.g. -Xmx128m for a 128 MB maximum heap).
The best thing might be to ask the users who encounter the out-of-memory issues to check their maximum heap size (using the command above) and set your machine to the same maximum heap size, so you have the same conditions as they have.
The issue can be reproduced with larger RAM.
First you need to get the heap size configuration from the people who reported the issue.
Use the same heap size to reproduce the issue.
Use the JVM parameters below for the heap settings.
-Xmx512m sets the max heap memory, which is used to store objects.
-XX:MaxPermSize=64m sets the max perm gen size. This space is used to store metadata such as loaded classes.

What are the default maximum heap sizes for the various Sun JVM's?

One would think it would be easy to find a table that lists the default maximum heap sizes for the different JVM versions...but a quick search didn't find such a thing.
So, what are the default maximum heap sizes for the various Sun JVM's?
I totally see the point in your question, since the JVM defaults differ by vendor and version. I would also be interested in a proper matrix, but all I have is the Oracle Java 7 documentation, which says:
The default value is chosen at runtime based on system configuration.
For server deployments, -Xms and -Xmx are often set to the same value.
For more information, see HotSpot Ergonomics
and deeper inside the docs it says:
The following changes take effect with J2SE 5.0.
Garbage Collector of Server VM Changed to Parallel Garbage Collector
On server-class machines running the server VM, the garbage collector (GC) has changed from the previous serial collector (-XX:+UseSerialGC) to a parallel collector (-XX:+UseParallelGC). You can override this default by using the -XX:+UseSerialGC command-line option to the java command.
Initial Heap Size and Maximum Heap Size Changed for Parallel Garbage Collector
On server-class machines running either VM (client or server) with the parallel garbage collector (-XX:+UseParallelGC), the initial heap size and maximum heap size have changed as follows.
Initial heap size: Larger of 1/64th of the machine's physical memory on the machine or some reasonable minimum. Before J2SE 5.0, the default initial heap size was a reasonable minimum, which varies by platform.
Maximum heap size: Smaller of 1/4th of the physical memory or 1GB. Before J2SE 5.0, the default maximum heap size was 64MB.
Note: The boundaries and fractions given for the heap size are correct for J2SE 5.0. They are likely to be different in subsequent releases as computers get more powerful.
Jared, according to Java 1.5 documentation, the default max size is 64MB.
Take a look at -Xmsn and -Xmxn non-standard options.
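To see which default your own JVM actually chose, one option (assuming a HotSpot JVM; use findstr instead of grep on Windows) is:
java -XX:+PrintFlagsFinal -version | grep -i maxheapsize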

Why am I able to set -Xmx to a value greater than physical and virtual memory on the machine on both Windows and Solaris?

On a 64-bit Windows machine with 12GB of RAM and 33GB of Virtual Memory (per Task Manager), I'm able to run Java (1.6.0_03-b05) with an impossible -Xmx setting of 3.5TB but it fails with 35TB. What's the logic behind when it works and when it fails? The error at 35TB seems to imply that it's trying to reserve space at startup. Why would it do that for -Xmx (as opposed to -Xms)?
C:\temp>java -Xmx3500g ostest
os.arch=amd64
13781729280 Bytes RAM
C:\temp>java -Xmx35000g ostest
Error occurred during initialization of VM
Could not reserve enough space for object heap
Could not create the Java virtual machine.
On Solaris (4GB RAM, Java 1.5.0_16), I pretty much gave up at 1 PB when trying to find out how high I can set -Xmx. I don't understand the logic for when it will error out on the -Xmx setting.
devsun1.mgo:/export/home/mgo> java -d64 -Xmx1000000g ostest
os.arch=sparcv9
4294967296 Bytes RAM
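(The ostest program is not shown in the question; a minimal reconstruction that prints the same two lines, using the HotSpot-specific com.sun.management extension of OperatingSystemMXBean, might look like this:)
import java.lang.management.ManagementFactory;

public class ostest {
    public static void main(String[] args) {
        System.out.println("os.arch=" + System.getProperty("os.arch"));
        // getTotalPhysicalMemorySize() lives on the com.sun.management subinterface, not the standard one
        com.sun.management.OperatingSystemMXBean os =
                (com.sun.management.OperatingSystemMXBean) ManagementFactory.getOperatingSystemMXBean();
        System.out.println(os.getTotalPhysicalMemorySize() + " Bytes RAM");
    }
}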
At least with the Sun 64-bit VM 1.6.0_17 for Windows, ObjectStartArray::initialize will allocate 1 byte for each 512 bytes of heap on VM startup. Starting the VM with 35TB heap will cause the VM to allocate 70GB immediately and hence fail on your system.
The 32-bit VM (and I suppose also the 64-bit VM) from Sun does not take account of available physical memory when calculating the maximum heap; it is only limited by the 2GB of addressable memory on Windows and Linux (or 4GB on Solaris), or by possibly failing to allocate enough memory at startup for the management area.
If you think about it, checking the sanity of the max heap value against available physical memory does not make much sense. X GB of physical memory does not mean that X GB is available to the VM when required, it can just as well have been used by other processes, so the VM needs a way to cope with the situation that more heap is required than available from the OS anyway. If the VM is not broken, OutOfMemoryErrors are thrown if memory cannot be allocated from the OS, just as if the max heap size has been reached.
According to this thread on Sun's java forums (the OP has 16GB of physical memory):
You could specify -Xmx20g, but if the total of the memory needed by all the processes on your machine ever exceeds the physical memory on your machine, you are likely to end up paging. Some applications can survive running in paged memory, but the JVM isn't one of them. Your code might run okay, but, for example, garbage collections will be abysmally slow.
UPDATE: I googled a bit further and, according to the Frequently Asked Questions About the Java HotSpot VM and more precisely How large a heap can I create using a 64-bit VM?
How large a heap can I create using a 64-bit VM?
On 64-bit VMs, you have 64 bits of addressability to work with, resulting in a maximum Java heap size limited only by the amount of physical memory and swap space your system provides.
See also: Why can't I get a larger heap with the 32-bit JVM?
I don't know why you are able to start a JVM with a heap >45GB. This is a bit confusing...
Just to reinforce Pascal's answer: be very careful in Windows when specifying a high max memory size. I was working on a server project that required as much physical memory as possible, but once you are over physical RAM, abysmal performance is not a good description of what happens; a hung machine might be better.
What happens (at least this is my evaluation of it after days of examining logs and re-running tests) is: Windows runs out of RAM and asks all apps to free up what they can. When it asks Java, Java kicks off a GC. The GC touches all of memory (causing anything that has been swapped out to be swapped in). This in turn causes Windows to run out of memory. Windows then sends a message to all apps asking them to free up what they can... (recurse indefinitely).
This may not ACTUALLY be what is going on, but the fact that the Java GC touches very old memory at times makes it incompatible with paging.
