I have two questions with respect to Java memory management.
Where are static and instance variables stored? I believe static variables are stored in PermGen, but I am not sure about instance variables.
Is PermGen a subset of the heap or of the method area?
When I was googling, I found some sources stating that static variables are stored in the PermGen section of the heap, but others stating that PermGen is a subset of the method area. If the latter is true, then are static variables stored in the method area?
Where are static and instance variables stored?
It changes from Java version to Java version and from runtime to runtime. Ultimately, Java is designed to hide away memory details such as "Where do my objects sit in memory?".
Some compilers might optimize the objects away, or declare them on the stack or in any other place they find suitable.
So the answer is: "We can't tell for sure, and does it really matter for Java anyway?"
Stack or heap storage depends more on whether the variable is a primitive, e.g. an int, or an object.
Unless your primitive is part of an object, it will most likely be created on the call stack. Otherwise it will be created as part of the object on the heap.
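For instance, a minimal sketch (the class and field names are just illustrative; the JIT and escape analysis can still change where things actually end up):

    public class StorageSketch {
        private int counter;        // instance field: lives inside the object on the heap
        private static int shared;  // static field: stored with the class (exact location depends on JVM version)

        void work() {
            int local = 42;         // primitive local variable: typically on the thread's call stack
            Integer boxed = local;  // boxed value: an object, so normally heap-allocated (unless optimized away)
        }
    }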
And as others have mentioned, there is no more PermGen in current Java (i.e. 8) and onward.
PermGen elimination in JDK 8
Do Java primitives go on the Stack or the Heap?
Where does the JVM store primitive variables?
In JDK 8 there is no PermGen any more:
With the advent of JDK8, we no longer have the PermGen. No, the metadata information is not gone, just that the space where it was held is no longer contiguous to the Java heap. The metadata has now moved to native memory to an area known as the “Metaspace”.
Check the link below for details; it also answers your second question.
http://www.infoq.com/articles/Java-PERMGEN-Removed
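If you need to cap that native area, Metaspace has its own flags. A hedged example of typical usage (the application name MyApp is a placeholder, and sensible values depend entirely on your workload):

    # JDK 8+: the old PermGen flags are ignored; Metaspace lives in native
    # memory and grows until -XX:MaxMetaspaceSize (unlimited by default).
    java -XX:MetaspaceSize=128m -XX:MaxMetaspaceSize=256m MyApp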
Related
I was wondering what specifically the heap stores in its nodes. I understand a heap to be a kind of binary tree, and from what I have studied of trees, the nodes contain a reference to the value stored. My question is: in the case of the Java heap, does the node structure contain a Java object reference to the location (stored somewhere else in RAM) of a stored object (in the case of a reference type), or a pointer to the memory location of the data type, or some other representation?
Reading about the subject, I thought it strange that an object defined as a local variable is present both on the stack and on the heap (until I realized that this is necessary, since local variables are supposed to be visible only to the relevant thread via that thread's stack). Still, I thought it odd to use a pair of object references like this, and wondered whether I had misunderstood the implementation.
The Java heap just has to conform to section 2.5.3 of the JVM specification. There is no single implementation, so strictly speaking your question does not make sense.
There's too little space here to fully explain the Oracle server and client VMs. You should read up on your target VM and ask more specific questions if you get stuck.
You should compare the Java stack and heap to the related concepts in C (stack allocation vs. malloc), with the differences that you do not need to free memory yourself thanks to the GC, and that you are not allowed to do pointer arithmetic because objects can be moved at any time.
The Java memory model, on the other hand, prescribes what guarantees the VM has to make under concurrent access to various kinds of variables; compare it to C++'s std::atomic. It is unrelated to the memory layout.
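A minimal Java sketch of that distinction (illustrative names; this is about visibility and atomicity guarantees, not about where the fields live):

    import java.util.concurrent.atomic.AtomicInteger;

    class Counters {
        // Writes to 'done' are guaranteed to become visible to other threads,
        // much like an atomic flag in C++.
        private volatile boolean done;
        private final AtomicInteger hits = new AtomicInteger();

        void worker() {
            while (!done) {
                hits.incrementAndGet();   // atomic read-modify-write
            }
        }

        void stop() {
            done = true;                  // happens-before the worker's next read of 'done'
        }
    }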
If I have not assumed or learned anything wrong, then every variable we assign takes up a certain place in RAM.
For example, while working with a Java array, when we try to print the array it prints a "location":
    String[] a = new String[2];
    System.out.println(a);
    // prints something like: [Ljava.lang.String;@be6280
Now is there any way to set that location?
I think it is possible in C++, is it? If any language offers this, then I should be able to scan my RAM for that variable or array location, at least exhaustively. Can't I? Has anyone tried doing it?
You could use the Unsafe class as shown here. This is specific to the HotSpot JVM, but it's probably a start.
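For completeness, a rough sketch of that Unsafe approach (HotSpot-specific, uses the non-public sun.misc.Unsafe, and with compressed oops the printed value is an encoded reference rather than a raw address; the GC may also move the object at any moment, so it is only a snapshot):

    import sun.misc.Unsafe;
    import java.lang.reflect.Field;

    public class WhereIsIt {
        public static void main(String[] args) throws Exception {
            // Obtain the Unsafe singleton reflectively; it is not part of the public API.
            Field f = Unsafe.class.getDeclaredField("theUnsafe");
            f.setAccessible(true);
            Unsafe unsafe = (Unsafe) f.get(null);

            String[] a = new String[2];
            Object[] holder = new Object[] { a };   // park the reference in an array slot we can read

            long offset = unsafe.arrayBaseOffset(Object[].class);
            int scale = unsafe.arrayIndexScale(Object[].class);

            long ref = (scale == 4)
                    ? unsafe.getInt(holder, offset) & 0xFFFFFFFFL   // 32-bit or compressed reference
                    : unsafe.getLong(holder, offset);               // uncompressed 64-bit reference

            System.out.println("Reference bits: 0x" + Long.toHexString(ref));
        }
    }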
No, this is not possible in Java. For developers of debugging tools and other special tools there is a backdoor for direct memory access, but that is not what you and 99.995% of other people want.
Memory in Java (RAM) is controlled by the Java Virtual Machine (JVM) and there is no way, unlike other programming environments, to handle memory allocation manually in the Java heap. You can allocate memory outside Java using JNI or NIO.
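For the NIO route mentioned above, a small sketch using a direct buffer (class name is just for illustration):

    import java.nio.ByteBuffer;

    public class OffHeapDemo {
        public static void main(String[] args) {
            // 1 KiB of native (off-heap) memory; released once the buffer object is GC'd.
            ByteBuffer offHeap = ByteBuffer.allocateDirect(1024);
            offHeap.putInt(0, 42);
            System.out.println(offHeap.getInt(0));   // prints 42
        }
    }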
Setting the location is very unsafe as the GC can be performed at any time and a corrupt memory structure will crash the JVM.
You can get the location of an object, but you don't need it. You can use reflection to get any field of an object, or you can use Unsafe to read or set a field.
I would like to test how many bytes an object reference use in the Java VM that I'm using. Do you guys know how to test this?
Thanks!
Taking the question literally: on most JVMs, references take 4 bytes on 32-bit JVMs; on 64-bit JVMs a reference takes 8 bytes unless -XX:+UseCompressedOops is used, in which case it takes 4 bytes.
I assume you are asking how to tell how much space an object occupies. You can use Instrumentation (not a simple matter), but this will only give you the shallow size. Java tends to break into many objects something which in C++ might be a single structure, so the shallow size is not as useful.
However, if you have a memory issue, I suggest you use a memory profiler. This will give you the shallow and deep space objects use and give you a picture across the whole system. This is often more useful, as you can start with the biggest consumers and optimise those; even if you have been developing Java for ten-plus years, you will only be guessing where the best place to optimise is unless you have hard data.
Another way to get the object size, if you don't want to use a profiler, is to allocate a large array of them and see how much memory is consumed. You have to do this many times to get a good idea of what the average size is. I would set the young space very high to avoid GCs confusing your results, e.g. -XX:NewSize=1g.
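A hedged sketch of that measurement trick, assuming you run it with a large heap and young generation as suggested (swap new Object() for the class you actually care about, and average over several runs):

    public class RoughObjectSize {
        public static void main(String[] args) {
            final int COUNT = 1000000;
            Object[] holder = new Object[COUNT];       // keep strong references so nothing is collected

            Runtime rt = Runtime.getRuntime();
            System.gc();                               // best effort; only a hint to the VM
            long before = rt.totalMemory() - rt.freeMemory();

            for (int i = 0; i < COUNT; i++) {
                holder[i] = new Object();              // replace with the class you want to measure
            }

            long after = rt.totalMemory() - rt.freeMemory();
            System.out.println("~" + ((after - before) / COUNT) + " bytes per instance");
        }
    }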
It can differ from JVM to JVM, but "Sizeof for Java" says:
You might recollect "Java Tip 130: Do You Know Your Data Size?" that described a technique based on creating a large number of identical class instances and carefully measuring the resulting increase in the JVM used heap size. When applicable, this idea works very well, and I will in fact use it to bootstrap the alternate approach in this article.
If you need to be fairly accurate, check out the Instrumentation framework.
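If you try it, the skeleton is a tiny premain agent like the sketch below (illustrative class name; you still need to package it in a jar with a Premain-Class manifest entry and start the JVM with -javaagent):

    import java.lang.instrument.Instrumentation;

    public class SizeOfAgent {
        private static volatile Instrumentation inst;

        // Called by the JVM before main() when started with -javaagent:sizeof.jar
        public static void premain(String args, Instrumentation instrumentation) {
            inst = instrumentation;
        }

        // Shallow size only: objects referenced by 'o' are not included.
        public static long sizeOf(Object o) {
            return inst.getObjectSize(o);
        }
    }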
This one is the one I use. Got to love those 16-byte references!
alphaworks.ibm.heapanalyzer
I was going through a document on Java memory management and came across PermSize, which I couldn't understand. The document says that the "JVM stores its metadata" there, but I couldn't quite work out what is meant by metadata. While googling, I read somewhere that it stores value objects (user-defined objects).
What kind of objects are stored there? An example with an explanation would be great.
A quick definition of the "permanent generation":
"The permanent generation is used to
hold reflective data of the VM itself
such as class objects and method
objects. These reflective objects are
allocated directly into the permanent
generation, and it is sized
independently from the other
generations." [ref]
In other words, this is where class definitions go (and this explains why you may get the message OutOfMemoryError: PermGen space if an application loads a large number of classes and/or on redeployment).
Note that PermSize is additional to the -Xmx value set by the user in the JVM options. MaxPermSize allows the JVM to grow the permanent generation up to the amount specified. Initially, when the VM is loaded, MaxPermSize will still be the default value (32 MB for -client and 64 MB for -server), but the space will not actually be taken up until it is needed. On the other hand, if you were to set BOTH PermSize and MaxPermSize to 256 MB, you would notice that the overall heap has increased by 256 MB in addition to the -Xmx setting.
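For illustration, on a pre-Java-8 JVM that combination would look something like this (MyApp is a placeholder, and the values are only examples):

    # roughly 512 MB of heap plus a 256 MB permanent generation
    # (plus thread stacks, code cache, etc.)
    java -Xmx512m -XX:PermSize=256m -XX:MaxPermSize=256m MyApp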
This blog post gives a nice explanation and some background. Basically, the "permanent generation" (whose size is given by PermSize) is used to store things that the JVM has to allocate space for, but which will not (normally) be garbage-collected (hence "permanent") (+). That means for example loaded classes and static fields.
There is also a FAQ on garbage collection directly from Sun, which answers some questions about the permanent generation. Finally, here's a blog post with a lot of technical detail.
(+) Actually parts of the permanent generation will be GCed, e.g. class objects will be removed when a class is unloaded. But that was uncommon when the permanent generation was introduced into the JVM, hence the name.
The permanent pool contains everything that is not your application data, but rather things required by the VM: typically interned strings and the byte code of defined classes, but also other "not yours" pieces of data.
A place to store your loaded class definitions and metadata. If a project with a large code base is loaded, an insufficient PermGen size will cause the popular java.lang.OutOfMemoryError: PermGen space.
Hmmm. Is there a primer anywhere on memory usage in Java? I would have thought Sun or IBM would have had a good article on the subject but I can't find anything that looks really solid. I'm interested in knowing two things:
at runtime, figuring out how much memory the classes in my package are using at a given time
at design time, estimating general memory overhead requirements for various things like:
how much memory overhead is required for an empty object (in addition to the space required by its fields)
how much memory overhead is required when creating closures
how much memory overhead is required for collections like ArrayList
I may have hundreds of thousands of objects, and I want to be a "good neighbor" and not be overly wasteful of RAM. I mean, I don't really care whether I'm using 10% more memory than the "optimal case" (whatever that is), but if I'm implementing something that uses 5x as much memory as it could with a simple change, I'd want to use less memory (or be able to create more objects for a fixed amount of available memory).
I found a few articles (Java Specialists' Newsletter and something from Javaworld) and the built-in java.lang.instrument.Instrumentation.getObjectSize() method, which claims to measure an "approximation" (??) of memory use, but these all seem kind of vague...
(and yes I realize that a JVM running on two different OS's may be likely to use different amounts of memory for different objects)
I used JProfiler a number of years ago and it did a good job, and you could break down memory usage to a fairly granular level.
As of Java 5, on Hotspot and other VMs that support it, you can use the Instrumentation interface to ask the VM the memory usage of a given object. It's fiddly but you can do it.
In case you want to try this method, I've added a page to my web site on querying the memory size of a Java object using the Instrumentation framework.
As a rough guide, in Hotspot on 32-bit machines:
objects use 8 bytes for "housekeeping"
fields use what you'd expect them to use given their bit length (though booleans tend to be allocated an entire byte)
object references use 4 bytes
overall object size has a granularity of 8 bytes (i.e. if you have an object with 1 boolean field it will use 16 bytes; if you have an object with 8 booleans it will also use 16 bytes)
There's nothing special about collections in terms of how the VM treats them. Their memory usage is the total of their internal fields plus, if you're counting this, the usage of each object they contain. You need to factor in things like the default array size of an ArrayList, and the fact that its capacity increases by about 1.5x whenever the list gets full. But either way, asking the VM or using the above metrics while looking at the source code of the collections and "working it through" will essentially get you to the answer.
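As a small illustration of that last point (a sketch only; the exact default capacity and growth factor are implementation details):

    import java.util.ArrayList;
    import java.util.List;

    public class ListSizing {
        public static void main(String[] args) {
            // The default constructor starts with a small backing array that grows
            // by roughly 1.5x each time it fills, leaving unused slack at the end.
            List<Integer> grown = new ArrayList<>();

            // If you already know you'll hold ~100000 elements, pre-sizing avoids
            // the intermediate arrays and most of the trailing slack.
            List<Integer> sized = new ArrayList<>(100000);

            for (int i = 0; i < 100000; i++) {
                grown.add(i);   // note: boxed Integers are themselves heap objects
                sized.add(i);
            }
        }
    }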
If by "closure" you mean something like a Runnable or Callable, well again it's just a boring old object like any other. (N.B. They aren't really closures!!)
You can use JMP, but it's only caught up to Java 1.5.
I've used the profiler that comes with newer versions of Netbeans a couple of times and it works very well, supplying you with a ton of information about memory usage and runtime of your programs. Definitely a good place to start.
If you are using a pre-1.5 VM, you can get the approximate size of objects by using serialization. Be warned, though: this can require double the amount of memory for that object.
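A quick sketch of that serialization trick (the serialized length is only a loose proxy: it omits object headers and padding and adds stream metadata):

    import java.io.ByteArrayOutputStream;
    import java.io.ObjectOutputStream;
    import java.io.Serializable;

    public class SerializedSize {
        static long roughSize(Serializable o) throws Exception {
            ByteArrayOutputStream bytes = new ByteArrayOutputStream();
            ObjectOutputStream out = new ObjectOutputStream(bytes);
            out.writeObject(o);         // walks the whole object graph
            out.close();
            return bytes.size();        // stream length in bytes, not the true heap footprint
        }

        public static void main(String[] args) throws Exception {
            System.out.println(roughSize("hello world") + " bytes serialized");
        }
    }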
See if PerfAnal will give you what you are looking for.
This might not be the exact answer you are looking for, but the posts at the following link will give you very good pointers: Other Question about Memory
I believe the profiler included in NetBeans can monitor memory usage as well; you can try that.