What actually is memory overhead in Java? - java

I have read what-is-the-memory-consumption-of-an-object-in-java and what-is-the-memory-overhead-of-an-object-in-java.
But I am still confused.
What is memory overhead? Is it the padding?
What is a JVM with compressed pointers? Is it the references?
If a 32-bit JVM is used, will the overhead be less? Of course yes. But is it because of padding?
So is it better to always use a 32-bit JVM, for memory efficiency or for performance?
The image below is from this link (page 26).
At the very start of the image they show 16 bytes of JVM overhead; why is that so?

What is memory overhead?
When more memory is used than the fields you created.
Is it the padding?
Some of it is padding, which can appear anywhere in the object except in the header, which is always at the start. The header is typically 8-12 bytes long.
What is a JVM with compressed pointers?
A technique for using 32-bit pointers in a 64-bit JVM to save memory.
Is it the references?
References can use this technique but so can pointers to the class information for an object.
If a 32-bit JVM is used, will the overhead be less?
Possibly, though this is the same as using compressed pointers for references and classes.
But is it because of padding?
It's because 64-bit pointers use more space than 32-bit pointers.
So is it better to always use a 32-bit JVM, for memory efficiency or for performance?
No. The 32-bit processor model has 32-bit registers, whereas the 64-bit model has twice as many registers, each double the size (64-bit), which means far more can be held in the fastest memory, the registers. 64-bit calculations also tend to be faster with a 64-bit processing model.
In general I would recommend you always use the 64-bit JVM unless you a) can't or b) have a very small amount of memory.
At the very start of the image they show 16 bytes of JVM overhead; why is that so?
This is not strictly correct. With a compressed class reference the header is 12 bytes; however, objects are 8-byte aligned by default, which means there will be 4 bytes of padding at the end (so the total comes to 16 bytes, but it is not all at the start).
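If you want to see the header, fields and padding for a particular class yourself, a small sketch like the one below works, assuming the OpenJDK JOL library (org.openjdk.jol:jol-core) is on the classpath:

    import org.openjdk.jol.info.ClassLayout;

    public class LayoutDemo {
        static class Pair {   // two int fields = 8 bytes of field data
            int a;
            int b;
        }

        public static void main(String[] args) {
            // Prints the header size, field offsets and any alignment padding
            System.out.println(ClassLayout.parseInstance(new Pair()).toPrintable());
        }
    }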
FAQ: Why can a 32-bit Compressed OOP address more than 4 GB?
Objects have to be 8-byte aligned by default. This makes memory management easier but sometimes wastes space on padding. A side effect is that the address of every object will have 000 for the lowest three bits (it has to be a multiple of 8), so those bits don't need to be stored. This allows a compressed oop to address 8 * 4 GB, or 32 GB.
With a 16-byte object alignment the JVM can address 64 GB with a 32-bit reference (however, the padding overhead is higher and might not be worth it).
IFAQ: Why is it slower at around 28 - 32 GB
While the reference can be multiplied by 8, the heap doesn't start at the start of memory. It typically starts around 4 GB in. This means that if you want the full 32 GB you have to add this offset, which has a slight overhead.
Heap sizes:
< 4 GB - zero-extend the address
4 - 28 GB - multiply by 8, i.e. << 3 (note: x64 has an instruction for this, to support double[] and long[])
28 - 32 GB - multiply by 8 and add a register holding the offset. Slightly slower, but not usually a problem.
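If you want to check which of these modes your own JVM picked for a given -Xmx, one option (a sketch that assumes a HotSpot JVM, where these diagnostic flags exist) is to read the VM options at runtime:

    import com.sun.management.HotSpotDiagnosticMXBean;
    import java.lang.management.ManagementFactory;

    public class OopsCheck {
        public static void main(String[] args) {
            HotSpotDiagnosticMXBean bean =
                    ManagementFactory.getPlatformMXBean(HotSpotDiagnosticMXBean.class);
            // Both flags are HotSpot-specific; other JVMs may not expose them.
            System.out.println("UseCompressedOops = " + bean.getVMOption("UseCompressedOops").getValue());
            System.out.println("ObjectAlignmentInBytes = " + bean.getVMOption("ObjectAlignmentInBytes").getValue());
        }
    }

Running this with -Xmx28g and again with -Xmx33g should show UseCompressedOops flip from true to false.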

Related

Does Direct Memory affect compressed Pointers in Java?

I am aware that once Java heap size grows past 32GB, we lose the benefits of compressed pointers and may have less effective memory (compared to 32GB) until the total heap reaches ~48GB.
Does Direct Memory usage affect the determination to use compressed pointers or not? For example, will I still be able to use them with settings like -Xmx28G -XX:MaxDirectMemorySize=12G?
I am aware that once Java heap size grows past 32GB, we lose the benefits of compressed pointers and may have less effective memory (compared to 32GB) until the total heap reaches ~48GB.
You can increase the object alignment to 16 bytes (with -XX:ObjectAlignmentInBytes=16), allowing you to use CompressedOops up to 64 GB.
Does Direct Memory usage affect the determination to use compressed pointers or not?
Direct memory is just native memory, like the thread stacks, GUI components, shared libraries etc. It is not part of the heap, nor is the metaspace.
For example, will I still be able to use them with settings like -Xmx28G -XX:MaxDirectMemorySize=12G
You can have -XX:MaxDirectMemorySize=1024G if you like; this is not part of the heap.
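To illustrate the distinction, here is a minimal sketch: the buffer below counts against -XX:MaxDirectMemorySize, not against -Xmx, so it has no effect on whether the heap stays under the compressed-oops limit.

    import java.nio.ByteBuffer;

    public class DirectDemo {
        public static void main(String[] args) {
            // 1 GB allocated outside the Java heap; limited by -XX:MaxDirectMemorySize, not -Xmx
            ByteBuffer offHeap = ByteBuffer.allocateDirect(1024 * 1024 * 1024);
            offHeap.putLong(0, 42L);
            System.out.println(offHeap.getLong(0));
        }
    }

Exceeding the configured direct-memory limit fails with an OutOfMemoryError ("Direct buffer memory") rather than growing the heap.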

Is there an alternative to AtomicReferenceArray for large amounts of data?

I have a large amount of data that I'm currently storing in an AtomicReferenceArray<X>, and processing from a large number of threads concurrently.
Each element is quite small and I've just got to the point where I'm going to have more than Integer.MAX_VALUE entries. Unfortunately, List and arrays in Java are limited to Integer.MAX_VALUE (or just less) values. Now I have enough memory to keep a larger structure in memory - the machine has about 250GB of memory in a 64-bit VM.
Is there a replacement for AtomicReferenceArray<X> that is indexed by longs? (Otherwise I'm going to have to create my own wrapper that stores several smaller AtomicReferenceArray and maps long accesses to int accesses in the smaller ones.)
Sounds like it is time to use native memory. Having 4+ billion objects is going to cause some dramatic GC pause times. However if you use native memory you can do this with almost no impact on the heap. You can also use memory mapped files to support faster restarts and sharing the data between JVMs.
I'm not sure what your specific needs are, but there are a number of open source data structures which do this, like HugeArray, Chronicle Queue and Chronicle Map. You can create an array which is 1 TB in size but uses almost no heap and has no GC impact.
BTW: for each object you create, there is an 8-byte reference and a 16-byte header. By using native memory you can save 24 bytes per object, e.g. 4 bn * 24 bytes is 96 GB of memory.
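If you would rather stay on the heap, the wrapper approach mentioned in the question is not much code; here is a minimal sketch (the class name and chunk size are just illustrative):

    import java.util.concurrent.atomic.AtomicReferenceArray;

    // Maps a long index onto several int-indexed AtomicReferenceArrays.
    public class LongAtomicReferenceArray<E> {
        private static final int CHUNK_BITS = 30;             // 2^30 elements per chunk
        private static final int CHUNK_SIZE = 1 << CHUNK_BITS;
        private final AtomicReferenceArray<E>[] chunks;

        @SuppressWarnings("unchecked")
        public LongAtomicReferenceArray(long length) {
            int chunkCount = (int) ((length + CHUNK_SIZE - 1) >>> CHUNK_BITS);
            chunks = new AtomicReferenceArray[chunkCount];
            for (int i = 0; i < chunkCount; i++) {
                long remaining = length - ((long) i << CHUNK_BITS);
                chunks[i] = new AtomicReferenceArray<>((int) Math.min(CHUNK_SIZE, remaining));
            }
        }

        public E get(long index) {
            return chunks[(int) (index >>> CHUNK_BITS)].get((int) (index & (CHUNK_SIZE - 1)));
        }

        public void set(long index, E value) {
            chunks[(int) (index >>> CHUNK_BITS)].set((int) (index & (CHUNK_SIZE - 1)), value);
        }

        public boolean compareAndSet(long index, E expect, E update) {
            return chunks[(int) (index >>> CHUNK_BITS)]
                    .compareAndSet((int) (index & (CHUNK_SIZE - 1)), expect, update);
        }
    }

This keeps everything in ordinary heap objects, so the GC-pause caveat above still applies.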

Java process size 32 bit vs 64 bit

From this IBM article:
A 32 bit Java process has a 4 GB process address space available shared by the Java Heap, Native Heap and the Operating System.
...
64 bit processes do not have this limit and the addressability is in terabytes. It is common for many enterprise applications to have large java heaps (we have seen applications with java heap requirements of over 100 GB). 64 bit Java allows massive Java heaps (benchmarks released with heaps up to 200 GB).
What's the explanation behind 64-bit processors having a very large address space while 32-bit processors do not? Basically, what's happening inside a 64-bit machine that's not happening inside a 32-bit machine?
What's the explanation behind 64-bit processors having a very large address space while 32-bit processors do not? Basically, what's happening inside a 64-bit machine that's not happening inside a 32-bit machine?
Quite simply, there's double the number of bits to store the address, so the number of addresses you can represent is squared.
It may be easier to see this with smaller values; for instance, with a 4-bit address space I could store addresses up to 1111, giving me 16 (2^4) addressable locations. With an 8-bit address space I could store up to 11111111, giving me 256 (16^2) addressable locations.
Note that this value just denotes the maximum amount of memory you can address; it doesn't actually give you this memory. But if you have more memory than you can address, you have no way of accessing it.
A 32-bit process usually has a 32-bit address space, which limits how much memory can be addressed. (See, for instance, "Why can't I get a larger heap with the 32-bit JVM?") A 64-bit process has a 64-bit address space, which essentially squares the number of addresses available.
With a 32-bit word, you can make about 4 billion different values.
That's 4 billion bytes' worth of memory addresses.
With 64 bits, you can represent far more values: about 4,000,000,000 squared, which comes out to roughly 18,000,000,000,000,000,000 (2^64).
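To put exact numbers on it, a trivial check:

    import java.math.BigInteger;

    public class AddressSpace {
        public static void main(String[] args) {
            // Number of distinct addresses representable with 32 and 64 bits
            System.out.println(BigInteger.ONE.shiftLeft(32)); // 4294967296 (~4 billion)
            System.out.println(BigInteger.ONE.shiftLeft(64)); // 18446744073709551616 (~1.8 * 10^19)
        }
    }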

Why do Java objects have to be a multiple of 8?

I know that Java uses padding; objects have to be a multiple of 8 bytes. However, I don't see the purpose of it. What is it used for? What exactly is its main purpose?
Its purpose is alignment, which allows for faster memory access at the cost of some space. If data is unaligned, then the processor needs to do some shifts to access it after loading the memory.
Additionally, garbage collection is simplified (and sped up) the larger the size of the smallest allocation unit.
It's unlikely that Java has a requirement of 8 bytes (except on 64-bit systems), but since 32-bit architectures were the norm when Java was created it's possible that 4-byte alignment is required in the Java standard.
The accepted answer is speculation (but partially correct). Here is the real answer.
First off, to #U2EF1's credit, one of the benefits of 8-byte boundaries is that 8-byte accesses are optimal on most processors. However, there was more to the decision than that.
If you have 32-bit references, you can address up to 2^32, or 4 GB, of memory (practically you get less, more like 3.5 GB). If you have 64-bit references, you can address 2^64 bytes, which is vastly more (on the order of exabytes). However, with 64-bit references everything tends to slow down and take more space. This is due to the overhead of dealing with 64-bit pointers instead of 32-bit ones, and on all processors more GC cycles, because less effective space means more garbage collection.
So, the creators took a middle ground and decided on 35-bit references, which allow up to 2^35 or 32 GB of memory and take up less space, so they have the same performance benefits as 32-bit references. This is done by taking a 32-bit reference and shifting it left 3 bits when reading, and shifting it right 3 bits when storing references. That means all objects must be aligned on 2^3 boundaries (8 bytes). These are called compressed ordinary object pointers, or compressed oops.
Why not 36-bit references for accessing 64 GB of memory? Well, it was a tradeoff. You'd require a lot of wasted space for 16-byte alignments, and as far as I know the vast majority of processors receive no speed benefit from 16-byte alignments as opposed to 8-byte alignments.
Note that the JVM doesn't bother using compressed oops unless the maximum memory is set to be above 4 GB, which it is not by default. You can actually enable them with the -XX:+UseCompressedOops flag.
This was back in the day of 32-bit VMs to provide the extra available memory on 64-bit systems. As far as I know, there is no limitation with 64-bit VMs.
Source: Java Performance: The Definitive Guide, Chapter 8
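To make the shift trick concrete, here is a toy sketch of the arithmetic (the real encoding happens inside the JVM in native code; this only illustrates why 8-byte alignment buys three extra address bits):

    public class CompressedOopDemo {
        // Objects are 8-byte aligned, so the low 3 bits of every address are zero.
        static int compress(long address) {
            return (int) (address >>> 3);                // store only the meaningful bits
        }

        static long decompress(int compressed) {
            return (compressed & 0xFFFFFFFFL) << 3;      // restore the address on load
        }

        public static void main(String[] args) {
            long address = 24L * 1024 * 1024 * 1024;     // a 24 GB address, beyond 32-bit range
            int oop = compress(address);
            System.out.println(decompress(oop) == address);  // prints true
        }
    }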
Data type sizes in Java are multiples of 8 bits (not bytes) because word sizes in most modern processors are multiples of 8-bits: 16-bits, 32-bits, 64-bits. In this way a field in an object can be made to fit ("aligned") in a word or words and waste as little space as possible, taking advantage of the underlying processor's instructions for operating on word-sized data.

Error while initializing Array:OutOfMemoryError

I have to allocate space to an array int input[] depending on the configuration parameters height and width.
int input[]=new int[height * width]; //this is line no 538
One of the configurations has parameters height=8192 and width=8192. So the size of the array becomes 67108864. But when I do this I get an OutOfMemoryError.
Exception in thread "main" java.lang.OutOfMemoryError: Java heap space
at Test.main(Test.java:538)
I have run this program in Eclipse as well as on Cygwin, but I face the same problem. I think this is an Error and not an Exception. How can I rectify this?
Since 8192 * 8192 * 4 = 256 M (integers are 4 bytes each), your matrix is using 256 MB of heap space by itself.
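As a quick sanity check before allocating, you can compute the requirement yourself (a small sketch; Integer.BYTES needs Java 8 or later):

    public class SizeCheck {
        public static void main(String[] args) {
            int height = 8192, width = 8192;
            long bytes = (long) height * width * Integer.BYTES;  // use long to avoid int overflow
            System.out.println(bytes + " bytes = " + (bytes >> 20) + " MB");  // 268435456 bytes = 256 MB
        }
    }

Anything close to or above the default heap size will need an -Xmx setting like the ones below.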
You can tell the JVM how much heap space should be available to your application. From running man java and looking through the nonstandard options:
-Xmxn
Specify the maximum size, in bytes, of the memory allocation
pool. This value must be a multiple of 1024 and greater than 2 MB.
Append the letter k or K to indicate kilobytes, or m or M to
indicate megabytes. The default value is chosen at runtime
based on system configuration. For more information, see
HotSpot Ergonomics
Examples:
-Xmx83886080
-Xmx81920k
-Xmx80m
On Solaris 7 and Solaris 8 SPARC platforms, the upper limit for
this value is approximately 4000m minus overhead amounts. On
Solaris 2.6 and x86 platforms, the upper limit is approximately
2000m minus overhead amounts. On Linux platforms, the upper limit
is approximately 2000m minus overhead amounts.
To use this option, you would start your application with a command like
java -Xmx1024m -jar foo.jar
In Eclipse, you can add command-line options as well. This page on eclipse.org describes how to add command-line arguments to a Java program. You should add -Xmx1024m (or some other sufficiently large heap specification) to the "VM arguments" section of the dialog shown on that site.
You probably have too little heap space to hold an array of the size you are targeting. You can increase the size of your heap with command line switches. For example, to set it to 256MB, include this switch:
-Xmx256m
If you multiply height * width * 4 (4 is the storage in bytes for an int) you can get a rough gauge of the amount of heap you will need, assuming the rest of the program does not need a significant amount. You will certainly need some more heap than that quick calculation suggests. Add maybe 20% extra, and try that out.
To get a better number than a rule-of-thumb calculation, you can look into heap profilers. There are several open source options:
http://java-source.net/open-source/profilers
See http://javarevisited.blogspot.com/2011/05/java-heap-space-memory-size-jvm.html for a good discussion of the heap in Java.
The memory is not enough for your program; there may be a memory leak. You can try the options below, and if that does not solve it, try increasing the -Xmx value.
java -Xmx1g -Xms512m
Depends on how much heap the JVM has. If you run it on the command line try adding -Xmx512m. If you work in an IDE add it to the "Run" properties.
An int is 32 bits (i.e. 4 bytes). So your array requires 8192*8192*4 bytes. This comes out at 256MB.
Java called with default arguments has only 64MB of heap space.
To get a larger heap, call Java using the -Xmx argument (Maximum memory size).
e.g. java -Xmx300M
Increase the memory for your Java process by adding this flag, which sets the "max" heap size. You might need to play around to get the optimal size for the heap. The default is probably really small; 64M is a common max size for many Java EE containers.
*Note I'm not saying this is exactly the size you'll need. Your unique case will dictate the size you'll need which you may need to experiment with.
-Xmx256M
