Process RAM usage - java

Just a question that I wasn't able to find an answer to after a lot of searching.
Can a single instance of a Java application use more than 10GB of RAM on a server that has more than 128GB of RAM installed, and still run fine without any issues? Or are there limits on the memory usage of a single process?

The maximum heap size for the Java Virtual Machine will most likely be constrained by your OS. Theoretically, on a 64-bit machine the JVM can be given a heap of up to 2^64 bytes; in practice, since you have 128GB of memory, that is most likely your upper limit. Without knowing what your OS is, though, other constraints may arise.
edit: this link contains a guide to help you set up the memory limits for your JVM on some different operating systems -> Guide
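As a rough check, a minimal sketch like the following (assuming you launch it with an explicit flag such as -Xmx12g, which is just an illustrative value) will print the heap ceiling the JVM was actually granted:

    // HeapCheck.java - print the heap ceiling the JVM was actually granted.
    // Launch with an explicit maximum, e.g.:  java -Xmx12g HeapCheck
    // (-Xmx12g is only an illustrative value; pick whatever your workload needs.)
    public class HeapCheck {
        public static void main(String[] args) {
            Runtime rt = Runtime.getRuntime();
            long mb = 1024 * 1024;
            System.out.printf("Max heap:   %,d MB%n", rt.maxMemory() / mb);
            System.out.printf("Total heap: %,d MB%n", rt.totalMemory() / mb);
            System.out.printf("Free heap:  %,d MB%n", rt.freeMemory() / mb);
        }
    }

If the reported maximum is far below what you asked for, the OS (or a 32-bit JVM) is the limiting factor, not Java itself.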

Related

swap out of memory

I am working on a Java application and I have set the following configuration in the VM options:
-Xms and -Xmx options are set to 1024m
-XX:MaxPermSize=128m
Hardware: 32-bit Windows 7 system with 2GB of RAM.
I often encounter a Java "out of swap space" error. What could be the reason? Please help me.
The reason is that your operating system is not configured with enough swap space for the job mix that you are running. Swap space is an area on disk where the operating system puts copies of memory pages when there are more virtual memory pages than physical memory pages.
So what has happened is that your JVM has asked for more virtual memory than the operating system can give it.
(Updated to include Peter's comments)
Some possible fixes:
Add more physical memory, assuming that the hardware and OS allow this. (In this case, the OS should allow it ...)
Configure the system with more swap space.
Kill some of the other non-essential applications and services running on the machine.
Change the Java application's JVM options to reduce the heap size.
The Java release notes, under "HotSpot VM", say about this exact error:
If you see this symptom, consider increasing the available swap space
by allocating more of your disk for virtual memory and/or by limiting
the number of applications you run simultaneously. You can also
decrease your usage of memory by reducing the value of the -Xmx flag,
which limits the size of the Java object heap.
In other words, your machine is doing too many other things using up memory to be able to supply the 1GB you're telling the Java VM that it should be able to use.
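If you want to see the situation from inside the JVM, a small diagnostic sketch along these lines can help. It relies on the HotSpot-specific com.sun.management.OperatingSystemMXBean (so the cast is an assumption about your JVM vendor) to report physical memory and swap alongside the configured heap maximum:

    // SwapCheck.java - rough check of physical memory and swap as seen by the JVM.
    // The cast to com.sun.management.OperatingSystemMXBean is HotSpot-specific,
    // so treat this as a diagnostic sketch rather than portable code.
    import java.lang.management.ManagementFactory;

    public class SwapCheck {
        public static void main(String[] args) {
            com.sun.management.OperatingSystemMXBean os =
                    (com.sun.management.OperatingSystemMXBean)
                            ManagementFactory.getOperatingSystemMXBean();
            long mb = 1024 * 1024;
            System.out.printf("Physical memory: %,d MB total, %,d MB free%n",
                    os.getTotalPhysicalMemorySize() / mb, os.getFreePhysicalMemorySize() / mb);
            System.out.printf("Swap space:      %,d MB total, %,d MB free%n",
                    os.getTotalSwapSpaceSize() / mb, os.getFreeSwapSpaceSize() / mb);
            System.out.printf("Max heap (-Xmx): %,d MB%n",
                    Runtime.getRuntime().maxMemory() / mb);
        }
    }

If free swap plus free physical memory is not comfortably larger than the configured -Xmx, the "out of swap space" error is to be expected.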

32 bit JVM commits >3G virtual memory

We have a 32-bit JVM running under 64-bit RHEL5 on a box which has plenty of memory (32G). For different reasons, this process requires a pretty large managed heap and permgen space -- currently, it runs with the following VM arguments:
 
-Xmx2200M -XX:MaxPermSize=128M -XX:+CMSClassUnloadingEnabled
I have started seeing JVM crashes recently because it - seemingly - ran out of native memory (it could not create native threads, or failed to allocate native memory, etc.). These crashes were not (directly) related to the state of the managed heap, as when those crashes happened the managed heap was ~50-70% full.
I know that the memory reserved for the managed heap (and permgen) is close to 2.5G, which leaves no more than 0.5G for the JVM itself, BUT
- I don't understand why 0.5G isn't enough for the JVM, even if there is constant GCing going on
- the real question is this: when I connect to the process using jconsole, it says that (currently)
Committed virtual memory: 3,211,180 kbytes
Which is more than 3G. I can imagine that for some reason the JVM thinks it has 3,211,180 kbytes (3.06G) of memory, but when it tries to go over 3G the memory allocation fails.
Any ideas on
a) why this happens
b) how it can be avoided
Thanks.
Mate
There is a lot of overhead in a typical VM that is not counted in the VM's own accounting, because it is essentially taken by the native parts of the process - e.g. the .so files mapped in to provide native code for system libraries are not counted in the base VM accounting. Your typical shared library is mapped in at the top GB of memory, so if you try to allocate memory into this region you will be denied, because it would overlap the shared libraries' memory region. Memory allocation on most OSes is performed by a simple bar (the program break) that is raised when you ask for more memory; when the raised bar would conflict with other uses of the address space, the allocation simply fails. Most of the details that follow come back to this.
You need to avoid needing so much memory in a 32-bit process. This is the fundamental challenge. It is trivial to get a 64-bit VM, which will let you use far more memory than is otherwise accessible - it is simply the practical answer in this situation.
If you are using a 32-bit process, there is a high probability that you are hitting the effective address-space limit of a 32-bit process. For Windows, this is at most about 3GB (and only 2GB by default) - anything above it is reserved for I/O space and the kernel. You can move this boundary, but doing so has a tendency to break applications/drivers that are designed for the 32-bit OS.
For Linux, you end up with ~3GB of usable addressable RAM per process, the rest is used up by things like the kernel and mapped in shared libraries. The limit is referred to as the 'address space limit', and I presume it can be tuned.
How to avoid it? Well, for the most part, you can't: it's a hard limitation of the 32-bit address space and of having the kernel/IO share that address space with the process on a 32-bit OS.
With 64-bit OSes you have (most of) the entire 64-bit address space to play with, which is vastly more than you will ever need.
When you start a JVM it reserves its maximum heap size immediately. How much of that memory is actually used doesn't really matter. Your application can address about 3 GB, of which you have allocated about 2.3 GB to the heap and permgen. The rest is available for shared libraries (typically around 200 MB) and thread stacks.
Worrying about why you can't use the full 3 GB of address space isn't very useful when the solution is relatively trivial (use a 64-bit JVM). I am assuming you don't have any shared libraries which are only available in 32-bit. However, if you do have additional shared libraries they can easily be using hundreds of MB.
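To see how the JVM's own accounting compares with what jconsole reports, a sketch like this (using the standard java.lang.management beans) prints the committed heap and non-heap sizes; the gap between their sum and the process's committed virtual memory is the native overhead discussed above (thread stacks, mapped shared libraries, JIT code cache, and so on):

    // CommittedMemory.java - show what the JVM accounts for itself (heap + non-heap).
    // The difference between this and jconsole's "committed virtual memory" is
    // native overhead: thread stacks, mapped .so files, the JIT code cache, etc.
    import java.lang.management.ManagementFactory;
    import java.lang.management.MemoryMXBean;
    import java.lang.management.MemoryUsage;

    public class CommittedMemory {
        public static void main(String[] args) {
            MemoryMXBean mem = ManagementFactory.getMemoryMXBean();
            MemoryUsage heap = mem.getHeapMemoryUsage();
            MemoryUsage nonHeap = mem.getNonHeapMemoryUsage();
            long mb = 1024 * 1024;
            System.out.printf("Heap:     used %,d MB, committed %,d MB, max %,d MB%n",
                    heap.getUsed() / mb, heap.getCommitted() / mb, heap.getMax() / mb);
            System.out.printf("Non-heap: used %,d MB, committed %,d MB%n",
                    nonHeap.getUsed() / mb, nonHeap.getCommitted() / mb);
            System.out.printf("Threads:  %d live (each costs native stack space)%n",
                    ManagementFactory.getThreadMXBean().getThreadCount());
        }
    }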

Why does Java use so much more memory on my Linux server?

I have two Java programs. On my computer, one of them uses 9MB of RAM and the other uses 77MB. But when I upload them to a server, the same programs use 382MB and 186MB! Is there a way to stop this from happening?
How do you measure the memory usage in each case? Different operating systems have a different concept of what constitutes "memory usage".
64-bit systems require more memory than 32-bit systems due to the increased pointer (reference in Java speak) size. Are you using a 32-bit OS and JVM on your desktop computer?
Are you using different JVM options? A moderately active Java application usually ends up using all the memory that is permitted by the -Xmx option, to minimize the CPU time spent on garbage collection. In addition, the default maximum heap space is determined in relation to the available physical memory - if the server has more memory, the Java applications are bound to use more memory as well.
Server JVMs (see the -server JVM option) have different settings and favor performance over memory usage. The -server option is the default on 64-bit systems.
Are you absolutely certain that the application load is the same in both cases?
It is quite common for applications to allocate virtual memory in large chunks to improve performance and efficiency. Nobody bothers to optimize such things because it has no practical effect. If you don't actually have a problem, there's nothing to fix.
Virtual memory is not a scarce resource, so attempting to reduce its consumption is wasted effort.
How did you measure those numbers?
Comparing the numbers from Windows Task Manager with those from ps(1) on Linux is futile, because they are computed differently. For example, shared libraries and shared memory are attributed differently on the two platforms. The memory management is also completely different.
If, on the other hand, you refer to numbers gathered from within your apps via Runtime (or similar), then you have to look at how the JVM was started and with which parameters. The most important ones are -Xmx and -Xms, but you might look up several others in the documentation for java or javaw.
Related to point 1:
How to measure actual memory usage of an application or process?
Unless you've explicitly set it (e.g. with a command-line argument like -Xmx128M), the default maximum heap size of the JVM depends on the amount of RAM available.
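A quick way to compare the two machines is to run the same small sketch on both and look at the output. It prints the VM flavour, the data-model width (sun.arch.data.model is a HotSpot-specific property, so its presence is an assumption) and the maximum heap the JVM chose:

    // JvmDefaults.java - print the details that usually explain why the heap is
    // bigger on the server: VM flavour, pointer width, and the chosen max heap.
    public class JvmDefaults {
        public static void main(String[] args) {
            System.out.println("VM name:      " + System.getProperty("java.vm.name"));
            System.out.println("VM version:   " + System.getProperty("java.vm.version"));
            System.out.println("Architecture: " + System.getProperty("os.arch")
                    + " (" + System.getProperty("sun.arch.data.model", "?") + "-bit data model)");
            System.out.printf("Max heap:     %,d MB%n",
                    Runtime.getRuntime().maxMemory() / (1024 * 1024));
        }
    }

A 64-bit server VM with a larger default -Xmx will typically account for much of the difference you are seeing.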

JVM Heapsize on 32 bit OS

I am using 32-bit Windows 7 with Eclipse, and I have 4GB of RAM.
I want to give my Java application a maximum heap size of around 3GB, but I am only able to allocate a maximum of 1.5GB through the VM arguments (-Xmx1056m).
What should I do? If I install 64-bit Windows 7, would I then be able to allocate a 3GB heap to my app?
A regular 32-bit Windows process can only address 2GB of memory, even if you have more memory available. You can find the memory limits for different Windows versions here.
Since the VM needs memory for more things than just the heap, the max heap size will be slightly less than the maximum memory available to the process. Usually, you can tweak the heap up to around 1.6GB for a 32-bit Windows VM.
You need a 64bit OS and 64bit VM to allocate this much RAM.
I don't have the link right now that describes the JVM memory management process. The limitation you have stumbled upon is a limitation of how Java performs garbage collection: the Java memory heap must be a contiguous block. The garbage collection algorithms were optimized around this design limitation so they can perform efficiently. In a 32-bit operating system, you are not in control of what memory addresses device drivers are loaded into. On Windows, the OS will only relocate the device driver base address stored in the DLL if it conflicts with already loaded code. Other operating systems may relocate all device drivers on load so that they live in a contiguous block near the kernel.
Basically, the limitation is the same for 64-bit and 32-bit operating systems; it's just that on a 64-bit OS there are many more address ranges to choose from, which is why a 64-bit OS is able to find a larger contiguous block of memory addresses than a 32-bit OS. Make sure you use a 64-bit JVM to match the OS.
EDIT: Additionally, the 32-bit JVM has a max heap size limit of 2GB.
REFERENCE:
http://publib.boulder.ibm.com/infocenter/javasdk/tools/index.jsp?topic=/com.ibm.java.doc.igaa/_1vg00014884d287-11c3fb28dae-7ff6_1001.html
Java maximum memory on Windows XP
http://forums.sun.com/thread.jspa?messageID=2715152#2715152
What you will need is not only a 64bit OS and a 64bit VM, but also more memory.
On a 32-bit Windows system the virtual address space is split, with 2GB for kernel operations and 2GB for user applications. So you're screwed.
There's one possible but very unlikely workaround: you can enable the /3GB switch to raise this limitation and have the system allocate 1GB of virtual address space for kernel operations and 3GB for user applications (if they are linked /LARGEADDRESSAWARE).
Unfortunately, the 32-bit Sun/Oracle HotSpot JVM isn't large-address-aware (that I know of), and other 32-bit JVMs likely aren't either.
But think about it: even if you were able to do that, you would use all the memory available on your system. Nothing would be left for other programs after you've allocated your 3GB of heap for your JVM. Your system would be swapping to disk all the time. It would be unusable.
Just get a 64-bit OS with more RAM. That's all there is for you, short of finding ways to have your program use less memory.
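If you want to measure the practical limit empirically rather than guess, a crude probe like the one below allocates fixed-size blocks until the heap is exhausted. It is only a sketch for experimentation (run it with the -Xmx you are trying to reach, e.g. java -Xmx1500m HeapProbe, where 1500m is just an example value):

    // HeapProbe.java - crude probe: allocate 16 MB blocks until the heap runs out,
    // to see how much heap this particular JVM/OS combination can really give you.
    import java.util.ArrayList;
    import java.util.List;

    public class HeapProbe {
        public static void main(String[] args) {
            final int blockMb = 16;
            List<byte[]> blocks = new ArrayList<>();
            int allocatedMb = 0;
            try {
                while (true) {
                    blocks.add(new byte[blockMb * 1024 * 1024]);
                    allocatedMb += blockMb;
                }
            } catch (OutOfMemoryError e) {
                blocks.clear(); // release everything so the JVM can print safely
                System.out.println("Allocated roughly " + allocatedMb + " MB before OutOfMemoryError");
            }
        }
    }

On a 32-bit Windows JVM you should see the probe top out well below the 3GB you were hoping for, which matches the address-space explanation above.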

application terminates on heapsize

I have set -Xmx to 1.3GB in the Java VM parameters and Eclipse does not allow more than this. When running the application I got the exception below:
Exception in thread "Thread-3" java.lang.OutOfMemoryError: Java heap space
What can I do?
You can set the maximum memory Eclipse itself uses with -mx1300m or the like. This limitation will be because you are running a 32-bit Java on Windows. On a 64-bit OS, you won't have this problem.
However, it's the maximum memory size you set for each application you run from Eclipse which matters. What have you set in your run options in Eclipse?
Your question is very unclear:
Are you running the application in a new JVM?
Did you set the -Xmx / -Xms parameters in the launcher for the child JVM?
If the answer to either of those questions is "no", then try doing ... both. (In particular, if you don't set at least -Xmx for the child JVM, you'll get the default heap size which is relatively small.)
If the answer to both of those questions is "yes", then the problem is that you are running into the limits of your hardware and/or operating system configuration:
On a typical 32-bit Windows, a user process can only address a total of 2**31 bytes of virtual memory, and some of that will be used by the JVM binaries, native libraries and various non-heap memory allocations. (On a 32-bit Linux, I believe you can have up to 2**31 + 2**30.) The "fix" for this is to use a 64-bit OS and a 64-bit JVM.
In addition, a JVM is limited on the amount of memory that it can request by the resources of the OS'es virtual memory subsystem. This is typically bounded by the sum of the available RAM and the size of the disc files / partitions used for paging. The "fix" for this is to increase the size of the paging file / partition. Adding more RAM would probably be a good idea too.
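If the application really does run in a child JVM, the heap limit has to be passed to that child explicitly. Here is a minimal sketch of doing so (worker.Main and the 1300m value are placeholders, not anything from the original question):

    // LaunchChild.java - launch a worker class in a separate JVM with an explicit
    // heap limit, instead of inheriting whatever the parent/Eclipse was given.
    import java.io.IOException;

    public class LaunchChild {
        public static void main(String[] args) throws IOException, InterruptedException {
            String javaBin = System.getProperty("java.home") + "/bin/java";
            ProcessBuilder pb = new ProcessBuilder(
                    javaBin,
                    "-Xms256m",          // initial heap for the child JVM
                    "-Xmx1300m",         // maximum heap for the child JVM (placeholder value)
                    "-cp", System.getProperty("java.class.path"),
                    "worker.Main");      // placeholder for your own main class
            pb.inheritIO();              // forward the child's stdout/stderr
            int exit = pb.start().waitFor();
            System.out.println("Child JVM exited with code " + exit);
        }
    }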
You may want to look at the AggressiveHeap option: http://java.sun.com/docs/hotspot/gc1.4.2/#4.2.2.%20AggressiveHeap|outline
It solved a similar issue for me.
