In Java, Heap Memory vs System Memory

I have a Java process running on Ubuntu which, according to JConsole, uses about 150 MB of heap, but for the same process Ubuntu's System Monitor shows roughly 470 MB. Also, when I check the size of the jars on the classpath, it comes to around 200 MB.
I am assuming that all jars present on the classpath will be loaded into the JVM for that particular process.
Can anyone help me understand this? Am I missing something?

Ubuntu's System Monitor shows the total memory occupied by the JRE (Java Runtime Environment). Besides the heap, the JRE uses other memory such as stack memory (native and Java stacks) and code memory (where Java classes and compiled code live). Thus, the heap figure JConsole reports will always be less than what the system monitor shows. Moreover, the JRE manages memory on its own, independently of the OS. It may already have acquired more memory from the OS than your program currently needs, and it holds on to it so that the next time your program requires more memory it does not have to go back to the OS, because system calls requesting more memory from the OS are expensive.
Coming to your JAR loading question: the jars present on your classpath are not necessarily all loaded by the JRE. Classes are loaded on demand. So there may be 100 jars on your classpath, but if only two classes from one of those jars are used, only those two classes are loaded into memory.
For more sophisticated memory analysis, I recommend using JVisualVM, which comes with several plugins for analyzing the memory of a running program.
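If you want to see this gap from inside the process itself, here is a minimal sketch using the standard Runtime API (the class name HeapStats is just for illustration). Note that the figure reported by the system monitor will still be larger, since it also covers thread stacks, loaded code and other JVM internals:

public class HeapStats {
    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();
        // totalMemory() is heap memory the JVM has already obtained from the OS
        long committed = rt.totalMemory();
        long used = committed - rt.freeMemory();
        System.out.println("Used heap:      " + used / (1024 * 1024) + " MB");
        System.out.println("Committed heap: " + committed / (1024 * 1024) + " MB");
        // maxMemory() is the ceiling set by -Xmx
        System.out.println("Max heap:       " + rt.maxMemory() / (1024 * 1024) + " MB");
    }
}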

Related

Setting VM Options in JAR (-Xmx) [JVM Startup Parameters]

I searched for this question and got many matches on Stack Overflow itself, but the answers there are somewhat contradictory.
In How to add VM options to jar? the top-voted and accepted answer says it is not possible, and most of the answers in Can I set Java max heap size for running from a jar file? also say that no, it is not possible. Most of these "not possible" answers were given by people with high reputation, so I assume they cannot all be wrong by coincidence.
One person said it can be done by using this; others said to make an installer for it, or use Launch4J, or make batch files, or make another JAR and run the main class through it, but most of these answers did not get many votes compared to those saying no.
So is it really possible or not? My problem is that I run out of heap space, and therefore I want to increase it for the JAR.
Q1. I have increased the heap space from NetBeans; will it be increased in the JAR too? (I think no, I am just confirming this.)
Q2. What should I do now to get the increased heap space for the JAR? (I am looking for an easy way, because I have no knowledge of batch scripting, and I am already using an installer (Advanced Installer) to place these files, so I do not want to add additional installers for this.) Is there a simple way out?
A1. When you run your application from NetBeans, you spawn a JVM process which executes the application. Setting the heap size from NetBeans simply means it will launch that JVM with the max heap size you configured. It will not affect the jar you are creating in any way.
A2. You cannot configure the heap size inside your jar. This cannot be done programmatically or through a manifest entry.
Setting the max heap can only be done by passing the right JVM options when launching the JVM, either with some kind of startup script or with a Java launcher, as mentioned in the many answers on Stack Overflow.
A startup script is the better option in my opinion, as it allows the end user to adjust the memory settings if needed. With a launcher the memory settings are usually hardcoded and cannot be changed.
I suggest you take a look at the many startup scripts that are available for various open source Java products, or search for "java startup script" or similar.
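As a rough illustration (the jar name and the values here are placeholders, not taken from your project), such a startup script often boils down to a single line:

java -Xms256m -Xmx1024m -jar myapp.jar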
Q1. I have increased the heap space from NetBeans; will it be increased in the JAR too?
Memory use in the JVM can be explained by the following definitions (they match the descriptions in java.lang.management.MemoryUsage); you can also read the same figures at runtime, as sketched after the definitions:
init
represents the initial amount of memory (in bytes) that the Java virtual machine requests from the operating system for memory management during startup. The Java virtual machine may request additional memory from the operating system and may also release memory to the system over time. The value of init may be undefined.

used
represents the amount of memory currently used (in bytes).

committed
represents the amount of memory (in bytes) that is guaranteed to be available for use by the Java virtual machine. The amount of committed memory may change over time (increase or decrease). The Java virtual machine may release memory to the system, and committed could be less than init. committed will always be greater than or equal to used.

max
represents the maximum amount of memory (in bytes) that can be used for memory management. Its value may be undefined. The maximum amount of memory may change over time if defined. The amount of used and committed memory will always be less than or equal to max if max is defined. A memory allocation may fail if it attempts to increase the used memory such that used > committed, even if used <= max would still be true (for example, when the system is low on virtual memory).
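If it helps, these figures can be read from inside a running program through the standard MemoryMXBean; a minimal sketch (the class name is arbitrary):

import java.lang.management.ManagementFactory;
import java.lang.management.MemoryUsage;

public class HeapUsagePrinter {
    public static void main(String[] args) {
        // getHeapMemoryUsage() returns a MemoryUsage holding init/used/committed/max
        MemoryUsage heap = ManagementFactory.getMemoryMXBean().getHeapMemoryUsage();
        System.out.println("init:      " + heap.getInit());
        System.out.println("used:      " + heap.getUsed());
        System.out.println("committed: " + heap.getCommitted());
        System.out.println("max:       " + heap.getMax());
    }
}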
Q2. What should I do now to make the increased heap space in JAR?
Maybe you can do this with a trick, like wrapping your jar inside another jar.
Copy your jar into another_project/resources/ and package another_project as a jar.
The code below is a rough sketch of the wrapper's main method (it assumes a wrapper class named Wrapper):
public static void main(String[] args) throws Exception {
    // "this" is not available in a static method, so use the class literal of the wrapper class
    String jarFile = Wrapper.class.getResource("/resources/you.jar").getFile();
    // Relaunch the wrapped jar in a new JVM with a larger heap
    Runtime.getRuntime().exec(new String[] {"java", "-Xmx256m", "-jar", jarFile}).waitFor();
}
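If you do go this route, note that on Java 7 and later a ProcessBuilder with inheritIO() is usually a better choice than Runtime.exec(), since otherwise the child JVM's output and exit code are easily lost. Also be aware that getResource(...).getFile() only yields a plain file path when you.jar actually sits on the file system; if it ends up packaged inside the wrapper jar itself, you would first have to copy it out to a temporary file before launching it.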

How do you deal with Java applications on the client requiring a lot of memory ("-J-Xmx")?

I have a Java SE desktop application which uses a lot of memory (1.1 GB would be desired). All target machines (Win 7, Win Vista) have plenty of physical memory (at least 4 GB, most of them more), and there is also enough free memory.
Now, when a machine has some uptime and a lot of programs have been started and terminated, the memory becomes fragmented (this is what I assume). This leads to the following error when the JVM is started:
JVM creation failed
Error occurred during initialization of VM
Could not reserve enough space for object heap
Even closing all running programs doesn't help in such a situation (even though Task Manager and other tools report enough free memory). The only thing that helps is to reboot the machine and start the Java application as one of the first programs launched.
As far as I've investigated, the Oracle VM requires one contiguous chunk of memory.
Is there any other way to assign about 1.1 GB of heap to my Java application when this amount is available but may be fragmented?
I start my JVM with the following arguments:
-J-client -J-Xss2m -J-Xms512m -J-Xmx1100m -J-XX:PermSize=64m -J-Dsun.zip.disableMemoryMapping=true
Is there any other way to assign about 1.1 GB of heap to my Java application when this amount is available but may be fragmented?
Use an OS which doesn't get fragmented virtual memory, e.g. 64-bit Windows or any version of UNIX.
BTW, it is hard for me to imagine how this is possible in the first place, but I know it to be the case. Each process has its own virtual memory, so its arrangement of virtual memory shouldn't depend on anything which is already running or has run before.
I believe it might be a hangover from the MS-DOS TSR days. Loaded shared libraries are given absolute addresses in memory (added towards the end of the signed 2 GB address space; the high half is reserved for the OS and the last 512 MB for the BIOS), meaning they must use the same address range in every program they are used in. Over time the maximum usable address is determined by the lowest shared library loaded or used (I don't know which one, but I suspect the lowest loaded).

Java RAM increases although Heap stays same? [duplicate]

Possible Duplicate:
Limit jvm process memory on ubuntu
In my application I'm uploading documents to a server, which does some analysis on them.
Today I analyzed my application using jconsole.exe and heap dumps, trying to find out whether I have memory issues or a memory leak. I thought I might, since my application's RAM usage grows a lot while it is running.
As I watched the heap / code cache / perm gen etc. with JConsole over several runs, I was surprised to see the following:
picture link: https://www7.pic-upload.de/13.06.12/murk9qrka8al.png
As you can see in the JConsole view on the right, the heap increases when I'm doing analysis-related work, but it also decreases back to its normal size when the work is over. On the left you can see "htop" on the server the application is deployed on. And there it is: although the heap behaves normally and the garbage collector also seems to be running correctly, the RAM usage is incredibly high at almost 3.2 GB.
This is really confusing me. I was wondering whether my JVM stack could have something to do with this, but the research I did described the VM stack as a small amount of memory of only a few megabytes (or even only kilobytes).
My technical background:
The application is running on GlassFish v3.1.2
The database is running on MySQL
Hibernate is used as the ORM framework
The Java version is 1.7.0_04
It's implemented using Vaadin
The MySQL database and GlassFish are the only things running on this server
I'm constructing XML-DOM-style documents using JAXB during the analysis and saving them in the database
Uploaded documents are either .txt or .pdf files
The OS is Linux
Solution?
Do you have any ideas why this happens and what I can do to fix it? I'm really surprised at the moment, since I thought the memory problems came from a memory leak causing the heap to explode. But now the heap isn't the problem; it's the RAM that goes higher and higher while the heap stays at the same level. And I don't know what to do to resolve it.
Thanks for every thought you're sharing with me.
Edit: Maybe I should also point out that this behaviour currently makes it impossible for me to really let other people use my application. When the RAM is full and the server no longer responds, I'm stuck.
Edit 2: Maybe I should also add that the RAM keeps increasing after every further successful analysis.
There are a lot more things that use memory in a JVM implementation than the heap settings.
The heap setting via -Xmx only controls the Java heap; it doesn't control the JVM's consumption of native memory, which is consumed quite differently depending on the implementation.
From the following article, Thanks for the Memory (Understanding How the JVM uses Native Memory on Windows and Linux):
Maintaining the heap and garbage collector uses native memory you can't control.
More native memory is required to maintain the state of the memory-management system maintaining the Java heap. Data structures must be allocated to track free storage and record progress when collecting garbage. The exact size and nature of these data structures varies with implementation, but many are proportional to the size of the heap.
and the JIT compiler uses native memory just like javac would
Bytecode compilation uses native memory (in the same way that a static compiler such as gcc requires memory to run), but both the input (the bytecode) and the output (the executable code) from the JIT must also be stored in native memory. Java applications that contain many JIT-compiled methods use more native memory than smaller applications.
and then you have the classloader(s) which use native memory
Java applications are composed of classes that define object structure and method logic. They also use classes from the Java runtime class libraries (such as java.lang.String) and may use third-party libraries. These classes need to be stored in memory for as long as they are being used. How classes are stored varies by implementation.
I won't even start quoting the section on Threads; I think you get the idea: the Java heap isn't the only thing that consumes memory in a JVM implementation, not everything goes in the JVM heap, and the heap takes up more native memory than what you specify, for management and bookkeeping.
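You can see part of this split from inside the JVM itself; here is a minimal sketch using the standard MemoryMXBean (it still won't account for thread stacks and other purely native allocations, so the OS figure will remain higher):

import java.lang.management.ManagementFactory;
import java.lang.management.MemoryMXBean;

public class JvmMemorySplit {
    public static void main(String[] args) {
        MemoryMXBean mem = ManagementFactory.getMemoryMXBean();
        // Heap: where your objects live, bounded by -Xmx
        System.out.println("Heap used:     " + mem.getHeapMemoryUsage().getUsed());
        // Non-heap: code cache, perm gen / metaspace and similar JVM-managed areas
        System.out.println("Non-heap used: " + mem.getNonHeapMemoryUsage().getUsed());
    }
}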
Native Code
App servers often have native code that runs outside the JVM but still shows up to the OS as memory associated with the process that controls the app server.

What are the empty 'pathname' entries in '/proc/smaps' for a Java process?

I seem to have a huge memory leak in a large Java application, but the leak does not seem to be within the JVM memory itself (i.e. heap, eden, survivor, code, perm gen, etc.), since I don't run out of that type of memory (it goes up during use, but eventually comes back down when the GC runs).
My problem is that I run out of system RAM! So I'm looking at '/proc/smaps' and using the 'pmap' tool to see what is going on. For example, the 'so', 'tmp' and 'jar' entries stay relatively stable: the number of mapped items doesn't increase all that much, nor does their mapped size grow unexpectedly (as expected).
But what does grow significantly over time is the number of mapped entries that are NOT assigned to a particular pathname. Over time there are more and more of these, and they don't seem to go away.
I can understand what's going on when, for example, the JVM maps a JAR file, but what exactly are these pathless mappings? Does anyone have an explanation or example?
Also, can anyone confirm that the '[heap]' entry is the JVM process's own native heap and has nothing to do with the -Xmx and -Xms Java heap settings?
I'm using Java(TM) 2 Runtime Environment, Standard Edition (build 1.5.0_07-b03) on a 2.6.16 Linux distro.
Which application server do you use?
It might be a C-heap (native memory) leak; can you upgrade your JDK and try again?

Running multiple JVMs for different applications on the same machine

We are getting frequent out-of-memory errors on our dev machines. We are running WebSphere, Eclipse, SoapUI and Maven on them. Our server goes down with these out-of-memory errors when we restart our applications in WebSphere two or three times. We have already increased the virtual memory setting in WebSphere to 1 GB.
So what I did was copy the JRE into the Eclipse and Maven folders, so that each of these uses an individual JVM. But WebSphere behaves the same: two or three restarts and out-of-memory errors.
Is there any way of making Eclipse and Maven use JVMs other than WebSphere's?
In response to the question:
If you start java multiple times, multiple copies of java will be running, each with its own memory. Eclipse and WebSphere are probably started separately, so they use independent memory. Your trouble should not be there.
In response to your problem
Out of Memory
Both Eclipse and WebSphere can gobble up memory like there's no tomorrow. Look at the -X flags: the flag for perm gen space should be added to the flag for heap space to get the total memory consumption. Also allow some overhead for the OS, windowing environment, e-mail client, browser (500 MB to 1 GB or so, depending on the OS and what you're running). So it may be that the computer is simply out of memory.
More frequently, the amount of memory assigned to the JVM is just not enough: Java has not been started with enough memory for the app assigned to it. It's up to you to deduce whether it is heap space or PermGen space that ran out. Both can be adjusted; have a look at this website. The flags are -Xmx and -XX:MaxPermSize. Look at the start scripts for WebSphere, as that's the one complaining.
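As a rough illustration only (the numbers are placeholders you would have to tune for your own setup), the options you would raise in such a start script look like:

-Xmx1024m -XX:MaxPermSize=256m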
Recommendation
Check which kind of memory is exhausted, and search for that on Stack Overflow; either PermGen or heap space should do.
You should set the JVM -Xmx parameter to 256 MB or less in all three processes. Then the heap will never cross the 256 MB limit (assuming the program does not have memory leaks).
Copying the JRE folder won't be enough.
In Eclipse, go to Preferences, and under Java -> Installed JREs make sure the JRE used is on a different path than the one used by WebSphere.

Categories

Resources