Java heap size growing to 10x the used heap - java

I am profiling a Java process with VisualVM and I found out that while my used heap remains constantly below 100 MB, the heap size keeps increasing to a point at which it is 10 times bigger than the used heap!
Reading from the docs:
By default, the virtual machine grows or shrinks the heap at each
collection to try to keep the proportion of free space to live objects
at each collection within a specific range. This target range is set
as a percentage by the parameters -XX:MinHeapFreeRatio= and
-XX:MaxHeapFreeRatio=, and the total size is bounded below by -Xms and above by -Xmx.
So, considering that MinHeapFreeRatio and MaxHeapFreeRatio are set to 40% and 70% respectively, why is this happening?
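The two numbers being compared here can be read from inside the process with the standard java.lang.Runtime API; a minimal sketch (the class and method names are illustrative):

```java
// Minimal sketch: the committed heap ("heap size" in VisualVM) vs. the
// used heap, both read via the standard java.lang.Runtime API.
public class HeapWatcher {

    // Committed heap: what VisualVM plots as "Heap size"
    static long heapSize() {
        return Runtime.getRuntime().totalMemory();
    }

    // Used heap: committed minus free
    static long usedHeap() {
        Runtime rt = Runtime.getRuntime();
        return rt.totalMemory() - rt.freeMemory();
    }

    public static void main(String[] args) {
        System.out.printf("size=%d MB, used=%d MB%n",
                heapSize() >> 20, usedHeap() >> 20);
    }
}
```

Logging these two values periodically shows the same gap VisualVM does: the committed size can sit far above what is actually in use.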

Related

Java JVM parameter Xms doesn't take effect immediately

I run my Java application through Tomcat with -Xms1024m set, but I found that the Java heap is only around 200~300 MB after the application starts. I thought Xms means the minimum heap size, so why doesn't the application reach the minimum heap size of 1024 MB immediately after startup?
Edit: BTW, the JVM is HotSpot 7.0.
It seems the GC does this in the method HeapRegion::setup_heap_region_size(uintx min_heap_size) in \openjdk-7-fcs-src-b147-27_jun_2011\openjdk\hotspot\src\share\vm\gc_implementation\g1\heapRegion.cpp, and in the method parse_each_vm_init_arg in \openjdk-7-fcs-src-b147-27_jun_2011\openjdk\hotspot\src\share\vm\runtime\arguments.cpp. Could someone familiar with the JVM GC source code help analyze this?
It only shows you the space used. The heap size determines the capacity of each region. Note: the actual available size is less, because you have two survivor spaces and only one is active in normal operation. E.g. if you have two survivor spaces of 50 MB each, a 1 GB heap will report only 950 MB available.
When you set the minimum heap size, this doesn't mean that space has been used yet. However, you might not get a GC until this heap size is reached (or until Eden, which is a portion of the heap, is full).
Say you have a 1 GB heap and the Eden space is 100 MB to start with. You will get a GC once Eden fills up, even though little tenured space is used. Eventually Eden grows, the tenured space starts to fill, and once it is full, the JVM might grow the heap beyond the minimum heap size you gave it.

Used vs Max vs Size -Jvisualvm?

In JVisualVM I see three attributes under Monitor > Heap, all depicting memory details with different figures:
Size : ?
Used :- I believe this is the actual memory used
Max :- I believe this is the max heap size allocated to java process (specified with Xmx)
I am not sure what Size actually depicts.
The three attributes can be defined as follows:
Size: The actual total reserved heap size
Used: The actual used heap size.
Max: The max size of the Java heap (young generation + tenured generation)
Indeed, when you launch your JVM, the initial heap size (which can be defined with -Xms) will be the initial total reserved heap size. Then, depending on how your application behaves, the JVM may need to increase the total reserved size until it reaches the max size, and if that is still not enough you can get an OOME.
Size depicts the heap block size assigned to the Java process. Try with -Xms512m or -Xms1024m; your starting size will then be 512 MB (or 1024 MB), but used memory may be much lower. As soon as used memory grows, heap resizing occurs proactively so that memory can be allocated to live objects.
It's like having a gas tank with a maximum capacity of 30 litres: you know you may need 20 litres for the trip, but you actually use only 5 litres.
Heap size is the actual size of the heap your running application has.
Used heap is the used portion of that heap size.
Max heap size is the maximum value the application's heap size can have (can be defined by the arg option -Xmx).
When you monitor the memory usage of a Java application, you see that the heap size may vary while the application runs; it cannot be greater than the max heap size.
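These three values can also be read programmatically via the standard java.lang.management API, which makes the mapping concrete (the class name is illustrative):

```java
import java.lang.management.ManagementFactory;
import java.lang.management.MemoryUsage;

// Sketch: the three VisualVM heap attributes correspond to fields of
// the standard MemoryUsage snapshot returned by MemoryMXBean.
public class HeapAttrs {

    static MemoryUsage heap() {
        return ManagementFactory.getMemoryMXBean().getHeapMemoryUsage();
    }

    public static void main(String[] args) {
        MemoryUsage h = heap();
        System.out.println("Size (committed): " + h.getCommitted());
        System.out.println("Used:             " + h.getUsed());
        System.out.println("Max (-Xmx):       " + h.getMax());
    }
}
```

So "Size" is the committed heap, which sits between "Used" and "Max" while the application runs.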

Impact of heap parameters on GC/performance?

In most places on the net, I find the following info about heap parameters:
-Xms<size> set initial Java heap size
-Xmx<size> set maximum Java heap size
Here is my understanding/question when I specify the parameters -Xms512M -Xmx2048M:
-Xms: My understanding is that if my Java process actually needs only 200M, then even with -Xms512M the process will be assigned only 200M (the actual memory required) instead of 512M. But if I already know that my application is going to take 512M at startup, then specifying less will impact performance, since the heap block will need to be resized anyway, which is a costly operation.
Per a discussion with my colleague, by default GC will trigger at 60% of the Xms value. Is that correct? If yes, is it minor GC or full GC that depends on the Xms value?
Update on Xms:
This seems to be true after reading JVM heap parameters, but again: is the value 60% by default, and is it minor or full GC that depends on the Xms value?
-Xmx: My understanding is that with -Xmx2048M, the Java process will reserve 2048M of memory from the OS for its own use, so that another process cannot be given that share. If the Java process needs more than 2048M anyway, an out-of-memory error will be thrown.
Also, I believe there is some relation between the full GC trigger and the value of -Xmx, because what I observed in JConsole is that a full GC happens when memory use reaches about 70% of Xmx. Is that correct?
Configuration: I am using a Linux box (64-bit JVM 8) with the default GC, i.e. Parallel GC.
GC is not triggered based on just Xms or Xmx value.
Heap = New + Old generations
The heap size (initially set to Xms) is split into two generations: New (aka Young) and Old (aka Tenured). The New generation is by default 1/3 of the total heap size and the Old generation 2/3. This can be adjusted with the JVM parameter NewRatio, whose default value is 2.
The Young generation is further divided into Eden and two Survivor spaces. The default ratio of these three spaces is 3/4, 1/8, 1/8.
Side note: This is about Parallel GC collectors. For G1 - new GC algorithm divides the heap space differently.
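The sizing arithmetic above can be sketched as a toy calculation (not the real HotSpot code; the class and method names are illustrative):

```java
// Toy model of the generation split described above:
// NewRatio=2 means Old = 2 * New, so New = heap / (NewRatio + 1).
public class GenSizes {

    static long newGenSize(long heapBytes, int newRatio) {
        return heapBytes / (newRatio + 1);
    }

    static long oldGenSize(long heapBytes, int newRatio) {
        return heapBytes - newGenSize(heapBytes, newRatio);
    }

    public static void main(String[] args) {
        long heap = 3L << 30; // a 3 GB heap
        System.out.println("young=" + newGenSize(heap, 2)); // 1 GB
        System.out.println("old=" + oldGenSize(heap, 2));   // 2 GB
    }
}
```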
Minor GC
All new objects are allocated in the Eden space (except massive ones, which are stored directly in the Old generation). When the Eden space becomes full, a minor GC is triggered. Objects which survive multiple minor GCs are promoted to the Old generation (the default is 15 cycles, which can be changed with the JVM parameter MaxTenuringThreshold).
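The promotion rule can be stated as a one-liner; a toy sketch (the real collector can also promote earlier under survivor-space pressure):

```java
// Toy sketch of tenuring: an object's age is the number of minor GCs
// it has survived; at MaxTenuringThreshold (default 15) it is promoted
// to the Old generation.
public class Tenuring {

    static final int MAX_TENURING_THRESHOLD = 15;

    static boolean promoted(int age) {
        return age >= MAX_TENURING_THRESHOLD;
    }

    public static void main(String[] args) {
        System.out.println(promoted(14)); // false: stays in a survivor space
        System.out.println(promoted(15)); // true: moves to the Old generation
    }
}
```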
Major GC
Unlike the concurrent collector, where a major GC is triggered based on used space (by default 70%), the parallel collectors calculate the threshold based on the three goals mentioned below.
Parallel Collector Goals
Max GC pause time - Maximum time spent in doing GC
Throughput - Percentage of time spent in GC vs. the application. Default: 1%.
Footprint - Maximum heap size (Xmx)
Thus, by default, the Parallel Collector tries to spend at most 1% of total application running time on garbage collection.
More details here
Xms to Xmx
During startup the JVM creates a heap of size Xms but reserves the extra space (up to Xmx) so the heap can grow later. The reserved space is called virtual space. Note that the JVM only reserves that space; it does not commit it.
Two parameters decide when the heap size grows (or shrinks) between Xms and Xmx:
MinHeapFreeRatio (default: 40%): Once the free heap space dips below 40%, a full GC is triggered and the heap size grows by 20%. Thus the heap can keep growing incrementally until it reaches Xmx.
MaxHeapFreeRatio (default: 70%): On the flip side, once the free heap space crosses 70%, the heap size is reduced by 5% at each GC until it reaches Xms.
These parameters can be set during startup. Read more about it here and here.
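The grow/shrink rule above can be sketched as a small function. This is an illustrative model using the percentages from the answer, not HotSpot's actual ergonomics code:

```java
// Illustrative model of MinHeapFreeRatio / MaxHeapFreeRatio resizing:
// grow by 20% when free space dips below minFreePct, shrink by 5% when
// it exceeds maxFreePct, clamped between Xms and Xmx.
public class FreeRatioPolicy {

    static long resize(long committed, long used,
                       int minFreePct, int maxFreePct, long xms, long xmx) {
        double freePct = 100.0 * (committed - used) / committed;
        if (freePct < minFreePct) {
            return Math.min((long) (committed * 1.20), xmx); // grow by 20%
        }
        if (freePct > maxFreePct) {
            return Math.max((long) (committed * 0.95), xms); // shrink by 5%
        }
        return committed; // free ratio already within the target range
    }
}
```

For example, a 100 MB heap with 70 MB used (30% free, below the 40% minimum) would grow to 120 MB, while the same heap with only 10 MB used (90% free) would shrink to 95 MB.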
PS: JVM GC is a fascinating topic and I would recommend reading this excellent article to understand it in depth. All the JVM tuning parameters can be found here.

When does out of memory happen?

Recently, when running our application, we met an out of memory exception.
This is the heap dump right before the exception happened
Heap
def new generation total 1572864K, used 366283K [0x00000006b0000000, 0x000000071aaa0000, 0x000000071aaa0000)
eden space 1398144K, 13% used [0x00000006b0000000, 0x00000006bbb12d40, 0x0000000705560000)
from space 174720K, 100% used [0x0000000710000000, 0x000000071aaa0000, 0x000000071aaa0000)
to space 174720K, 0% used [0x0000000705560000, 0x0000000705560000, 0x0000000710000000)
tenured generation total 3495296K, used 2658714K [0x000000071aaa0000, 0x00000007f0000000, 0x00000007f0000000)
the space 3495296K, 76% used [0x000000071aaa0000, 0x00000007bcf06ba8, 0x00000007bcf06c00, 0x00000007f0000000)
compacting perm gen total 42048K, used 41778K [0x00000007f0000000, 0x00000007f2910000, 0x0000000800000000)
the space 42048K, 99% used [0x00000007f0000000, 0x00000007f28ccb80, 0x00000007f28ccc00, 0x00000007f2910000)
No shared spaces configured.
It looks like the old gen was almost full (76%). I assume that when it finally reaches 100%, OOM happens. However, Eden looks only 13% used.
Can someone explain why OOM happens even if there is still some space in young gen?
There are a dozen different reasons why the JVM may throw an OutOfMemoryError, including:
Java heap space: when trying to allocate an object or an array larger than the maximum contiguous free block in either of the heap generations;
GC overhead limit exceeded: when the proportion of time JVM spends doing garbage collection becomes too high (see GCTimeLimit, GCHeapFreeLimit);
PermGen space (before Java 8) or Metaspace (since Java 8): when the amount of class metadata exceeds MaxPermSize or MaxMetaspaceSize;
Requested array size exceeds VM limit: when trying to allocate an array with length larger than Integer.MAX_VALUE - 2;
Unable to create new native thread: when reaching the OS limit of user processes (see ulimit -u) or when there is not enough virtual memory to reserve space for thread stack;
Direct buffer memory: when the size of all direct ByteBuffers exceeds MaxDirectMemorySize or when there is no virtual memory available to satisfy direct buffer allocation;
When the JVM cannot allocate memory for its internal structures, either because it has run out of available virtual memory or because a certain OS limit has been reached (e.g. the maximum number of memory map areas);
When JNI code fails to allocate some native resource;
Etc. Not to mention that an application can throw OutOfMemoryError itself at any time, simply because a developer decided so.
To find out the reason for your particular error, you should at least look at the error message, the stack trace, and the GC logs.
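One of the cases above is easy to demonstrate: an array whose length exceeds the VM limit fails regardless of how much heap is free (a sketch; the exact message text is HotSpot-specific):

```java
// Sketch: requesting an array longer than the VM limit (roughly
// Integer.MAX_VALUE - 2 elements on HotSpot) throws OutOfMemoryError
// even when plenty of heap is free.
public class OomKinds {

    static boolean arrayAllocationFails(int length) {
        try {
            int[] a = new int[length];
            return a == null; // never true; allocation succeeded
        } catch (OutOfMemoryError e) {
            // e.getMessage() identifies which of the cases above occurred,
            // e.g. "Requested array size exceeds VM limit"
            return true;
        }
    }

    public static void main(String[] args) {
        System.out.println(arrayAllocationFails(Integer.MAX_VALUE));
    }
}
```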

Under which circumstance will JVM decide to grow size of heap?

A JVM application runs on Oracle Hotspot JVM, it starts up with default JVM settings, but with 100MB of initial heap size and 1GB of maximum heap size.
Under which circumstances will JVM decide to grow the current heap size, instead of trying GC?
The HotSpot JVM continuously monitors allocation rates and object lifetimes. It tries to achieve two key goals:
let short-lived objects die in Eden
promote long-lived objects to the Old generation in time, to prevent unnecessary copying between survivor spaces
In a nutshell: HotSpot has a configured threshold that indicates what percentage of the total allocated heap has to be free after the garbage collector runs. For example, if this threshold is configured as 70% and heap usage after a full GC is 80%, then additional memory is allocated to hit the threshold. Of course, a bigger heap means longer pauses, while a smaller heap means more frequent collections.
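The arithmetic implied by that example can be sketched as follows (an illustrative calculation with hypothetical names, not HotSpot code):

```java
// Illustrative: grow the committed heap until the requested fraction
// of it is free, capped at the maximum heap size (-Xmx).
public class GrowToFreeTarget {

    static long targetCommitted(long usedBytes, double freeTarget, long maxBytes) {
        // Solve committed * (1 - freeTarget) >= used for committed.
        long needed = (long) Math.ceil(usedBytes / (1.0 - freeTarget));
        return Math.min(needed, maxBytes);
    }

    public static void main(String[] args) {
        // 80 MB used after a full GC, 70% of the heap should be free:
        System.out.println(targetCommitted(80, 0.70, 1000)); // 267 (MB)
    }
}
```

So with 80 MB live after a full GC and a 70% free-space target, the heap would be grown to about 267 MB, or to Xmx if that is smaller.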
But you have to remember that the JVM is very complex and you can change this behaviour, for example with these flags:
AdaptiveSizePausePolicy, which will pick the heap size that achieves the shortest pauses
AdaptiveSizeThroughPutPolicy, which will pick the heap size that achieves the highest throughput
GCTimeLimit and GCTimeRatio, which set the time spent in application execution
The number of objects occupying the heap increases while garbage collection is not possible. When objects cannot be collected as garbage because they are still in use by the current process, the JVM needs to increase its heap size towards the maximum to allow new objects to be created.
