I am running a Java game-server and a game-client on the same computer:
the game-client with
java -Xms512m -Xmx1024m -cp etc etc
and the game-server
java -Xmx1024M -Xms1024M -jar etc etc
Computer Properties:
Windows 7 64 bit
8GB RAM
CPU: i5-2500 @ 3.3GHz
Intel HD Graphics
Problem: The game-client experiences serious lag. Another player connected to the game-server via LAN has no lag issues!
Does the lag have anything to do with the Java virtual machine? Am I running one instance of the JVM or two?
Can I set up something differently in order to optimize the performance?
I suspect the problem is that a single machine is running both instances and its memory is not enough for both, but I do not really know how to solve that.
Edit: Neither app ran out of memory.
Solution:
1:
Updated Java version from:
java version "1.6.0_31"
Java(TM) SE Runtime Environment (build 1.6.0_31-b05)
Java HotSpot(TM) 64-Bit Server VM (build 20.6-b01, mixed mode)
to
java version "1.7.0_15"
Java(TM) SE Runtime Environment (build 1.7.0_15-b03)
Java HotSpot(TM) 64-Bit Server VM (build 23.7-b01, mixed mode)
2:
Changed the server properties to minimize requirements; this seems to have been the main fix.
3:
Increased memory:
game-client with java -Xms1024m -Xmx1024m -cp etc etc
and the game-server java -Xmx2048M -Xms2048M -jar etc etc
Server runs at about 700MB now.
Does the lag have anything to do with the Java virtual machine?
Possibly. You haven't presented enough evidence to be sure one way or the other.
The puzzling thing is that the client running on a different machine is not laggy.
Am I running one instance of the JVM or two?
If you are running two copies of java, then you will have two JVMs.
Can I set up something differently in order to optimize the performance?
The answer is probably yes. But you haven't provided enough information to allow us to make solid suggestions.
Lagginess can be caused by a number of things, including:
A network with high latency.
A JVM that has a heap that is too small.
An application that is generating lots of garbage and triggering an excessive number of garbage collections.
A mix of applications that is competing for resources; e.g. physical memory, CPU time or disc or network I/O time.
If you are going to get to the root cause of your problem, you will need to do some monitoring to figure out which of the above is the likely problem. Use the task manager or whatever to check whether the system is CPU bound, short of memory, doing lots of disk or network stuff, etc. Use VisualVM to see what is going on inside the JVMs.
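For example, to watch GC activity from the command line (a rough sketch; find the client JVM's pid with jps first), you could run:
jps
jstat -gcutil <pid> 1000
If the old-generation column sits near 100% or the full-GC count climbs quickly while the client lags, the heap is a likely culprit.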
Alternatively, you could try to fix with some totally unscientific "knob twiddling":
try making the -Xms and -Xmx parameters the same (that may reduce lagginess at the start ...)
try increasing the sizes of the JVMs' heaps; e.g. make them 2gb instead of 1gb
try using a more recent version of Java
try using a 64 bit JVM so that you can increase the heap size further
try enabling the CMS or G1 collectors (depending on what version of JVM you are using).
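As a concrete (and equally unscientific) starting point for the server, something like the following combines several of those knobs; the 2gb heap and the choice of G1 are assumptions to experiment with, not a recommendation:
java -Xms2048M -Xmx2048M -XX:+UseG1GC -jar etc etc
On Java 6 you would try -XX:+UseConcMarkSweepGC instead, since G1 only became properly supported in Java 7.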
If I knew more about what you were currently using, I could possibly give more concrete suggestions ...
You are running two Java apps on the same computer, resulting in two JVMs running.
For a 64-bit system with 8GB RAM, it's recommended to use at most 2GB per JVM (25% of physical memory, or 75% of free physical memory up to 2GB) for better performance.
You may have to look at adjusting the JVM sizes. For better performance, the -Xms and -Xmx sizes can be kept the same, within that maximum bracket.
Assigning memory to the heap is not the only area to think of. The JVM uses more memory than just the heap: other memory areas include the thread stacks, the method area, the class loader subsystem, the native method stacks, etc.
While both apps (game-server, game-client) are running, there is a chance of memory contention between them at the OS level, resulting in slowness.
In that case, the client app can be pinned to another core, if one is available.
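If you want to try that on Windows, the cmd shell's start command can set a CPU affinity mask when launching the client. This is only a sketch; the classpath and main class placeholders stand in for the original "etc etc":
start /affinity 3 java -Xms1024m -Xmx1024m -cp <client classpath> <main class>
The mask 3 (binary 11) restricts the client to cores 0 and 1, leaving the other cores free for the server.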
I've set the options for a Java process to use 80% of 1g max RAM. But when I use 'ps -o vsz', I see it is using 3.5g (starting from 2.5g). This causes a lot of swapping and thus freezes the device. Why the discrepancy?
UPDATE: The options to the JVM are now: -Xmx256m -Xshare:off -XX:+UseSerialGC -XX:NativeMemoryTracking=summary -XX:MaxRAM=768m -XX:MaxRAMPercentage=60. They don't seem to change anything. The process starts at 2.4g and grows to 3.5g
UPDATE 2:
openjdk version "14" 2020-03-16
OpenJDK Runtime Environment (build 14+36)
OpenJDK 64-Bit Server VM (build 14+36, mixed mode)
The right option is -Xmx1g:
-X: Option specific to this implementation of java.
mx: max heap memory
1g: 1 gigabyte.
You may want to also apply -Xms1g, which sets the minimum. Then the RAM load of your Java app is stable.
Note that the memload of your VM can still be more than 1GB (though 3.5 sounds excessive): Heap isn't the only memory the VM uses; every thread takes '1 stack' worth, which is also configurable (via -Xss128k for example), so if you have a ton of threads, memory load goes up (with half a meg of stack per thread, 4000 threads imply 1 GB worth of stack memory!). The VM's own runtime stuff also exists outside of heap.
Also, the memory taken by shared libraries needs to be 'bookkept' by the OS; I believe usually OSes just add the full memory load of them to each and every process that uses the shared library, and java tends to claim that it 'uses' most of the ones that are already loaded regardless, which inflates the number as well.
Turns out measuring how much memory an app takes is surprisingly difficult sometimes.
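Since the JVM is already started with -XX:NativeMemoryTracking=summary, you can ask it for a breakdown of its non-heap usage; a minimal sketch using jcmd (which ships with the JDK):
jcmd <pid> VM.native_memory summary
The report splits reserved/committed memory into heap, thread stacks, class metadata, code cache, GC structures and so on, which usually accounts for most of the gap between -Xmx and what ps reports.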
-XX:MaxRAM isn't the correct option; use -Xmx.
I had memory problems with one server. It's an amazon micro instance, so its memory is very limited (free -m says 603 MB). That's why I started tomcat with
-server -Xmx290m -Xms290m -XX:MaxPermSize=65m
However, the "java" process takes around 86% of the total memory, which is 518M. 518 - 355 = 163 MB of overhead. That looks like a lot, and is suspicious, especially given that:
a similar application running on another JVM version on another micro instance doesn't have an overhead this big
the same application run locally gives just 40 MB of overhead. Locally it runs on Windows 7, 64-bit.
The java version on the problematic server is:
java version "1.7.0_09-icedtea"
OpenJDK Runtime Environment (amzn-2.3.3.13.amzn1-x86_64)
OpenJDK 64-Bit Server VM (build 23.2-b09, mixed mode)
The big discrepancy between the local runtime and the one on the server makes me exclude the option that there are some expensive off-heap objects (e.g. byte buffers) in the application (and I'm not using any of that anyway). I know that the JVM overhead varies, but having more than 1/2 of the heap as overhead sounds too big. So what could be the reason for that? Or is it a normal way of things?
The choice of GC may impact heap size overhead, since each GC scheme must set aside some memory to manage your heap. Also, on such a small VM, you may not benefit much from going 64-bit. A 32-bit JVM will take up less heap than a 64-bit one, even when the 64-bit JVM uses compressed oops (-XX:+UseCompressedOops), which should be on by default.
So play with your favourite garbage collectors, pick the one that gives the best mix of overhead and latency for you.
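For instance, on a Tomcat install you could experiment with something along these lines via CATALINA_OPTS; the serial collector and the smaller thread-stack size are assumptions to try on a memory-starved box, not settings known to suit your app:
export CATALINA_OPTS="-server -Xms290m -Xmx290m -XX:MaxPermSize=65m -XX:+UseSerialGC -Xss256k"
Compare the resulting process size in top or free before and after each change.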
Sorry for asking the question, should have searched a bit more.
I'm running Weka with a rather large dataset and a memory-intensive algorithm. I need all the heap space I can get!
This works:
java -jar -Xmx2048m weka.jar &
But this does not
java -jar -Xmx4096m weka.jar &
I get:
Error occurred during initialization of VM
Could not reserve enough space for object heap
Could not create the Java virtual machine.
By some quick searching I found that this is the upper limit
java -jar -Xmx2594m weka.jar &
I have 4GB ram but a 32 bit machine. Why can't I use 2^32 bytes = 4096MB of memory?
For the future I am wondering if I can run java with e.g. hundreds of GB of heap space if I have the correct hardware and OS?
I have both 1.6 and 1.7 JVM installed:
$java -showversion
java version "1.6.0_24"
OpenJDK Runtime Environment (IcedTea6 1.11.4) (6b24-1.11.4-1ubuntu0.12.04.1)
OpenJDK Server VM (build 20.0-b12, mixed mode)
I have 4GB ram but a 32 bit machine. Why can't I use 2^32 bytes = 4096MB of memory?
For the future I am wondering if I can run java with e.g. hundreds of GB of heap space if I have the correct hardware and OS?
For 4 GB I suggest you use a 64-bit OS and possibly a 64-bit JVM as the limit for the heap size can be as small as 1.2 GB (on Windows XP)
If you want larger JVMs I suggest making sure you have 64-bit OS and JVM and you have more memory than the size of the JVM. e.g. if you want a 40 GB heap you need something like 48 GB or 64 GB of memory.
Use the 64-bit version of Java, which allows you to use more memory. You are hitting the limit of the 32-bit Java virtual machine.
If you have 4GB of RAM, how can you expect that all of it will be available to your JVM? The OS the JVM is running in also requires memory. Even though you can address all 4GB, an OS will generally limit the amount available per process.
You are not able to get an allocation of 4096m because the JVM tries to reserve a single contiguous block of 4096m, which is not available at any given point in time. So use some smaller value between 3000m and 4000m, or make sure your RAM is not being used by other processes.
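One quick way to find the largest heap your 32-bit JVM will actually reserve is to request it without running Weka at all, adjusting the number until the command stops failing; a sketch:
java -Xmx2594m -version
If the JVM can reserve the heap it just prints the version banner; otherwise you get the same "Could not reserve enough space for object heap" error.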
I'd like to run a very simple bot written in java on my VPS.
I want to limit jvm memory to let's say 10MB (I doubt it would need any more).
I'm running the bot with the following command:
java -Xms5M -Xmx10M -server -jar IrcBot.jar "/home/jbot"
But top shows that actual memory reserved for java is 144m (or am I interpreting things wrong here?).
13614 jbot 17 0 144m 16m 6740 S 0.0 3.2 0:00.20 java
Any ideas what can be wrong here?
Java version "1.6.0_20" Java(TM) SE Runtime Environment (build 1.6.0_20-b02) Java HotSpot(TM) Client VM (build 16.3-b01, mixed mode)
BTW. I'm running CentOS - if it matters.
EDIT:
Thank you for your answers.
I can't really accept any of them, since it turns out the problem lies with the language I chose to write the program in, not with the JVM itself.
-Xmx specifies the max Java heap allocation (-Xms specifies the min heap allocation). The Java process has its own overhead (the actual JVM etc), plus the loaded classes and the perm gen space (set via -XX:MaxPermSize=128m) sits outside of that value too.
Think of your heap allocation as simply Java's "internal working space", not the process as a whole.
Try experimenting and you'll see what I mean:
java -Xms512m -Xmx1024m ...
Also, try using a tool such as JConsole or JVisualVM (both are shipped with the Sun / Oracle JDK) and you'll be able to see graphical representations of the actual heap usage (and the settings you used to constrain the size).
Finally, as @Peter Lawrey very rightly states, the resident memory is the crucial figure here - in your case the JVM is only using 16 MiB RSS (according to 'top'). The shared / virtual allocation won't cause any issues as long as the JVM's heap isn't pushed into swap (non-RAM). Again, as I've stated in some of the comments, there are other JVMs available - "Java" is quite capable of running on low-resource or embedded platforms.
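For example, using the PID from your top output, you could attach JConsole to the running bot (a sketch; it needs to run as the same user, and since a VPS is usually headless you may need X forwarding or a remote JMX connection):
jconsole 13614
The Memory tab then shows heap, perm gen and other non-heap pools separately, which makes the 16 MiB resident figure easier to interpret.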
Xmx is the max heap size, but besides that there's a few other things that the JVM needs to keep in memory: the stack, the classes, etc. For a brief introduction see this post about the JVM memory structure.
The JVM maps in shared libraries, which are about 150m. The amount of virtual memory used is unlikely to be important to you if you are trying to minimise physical main memory.
The number you want to look at is the resident memory, which is the amount of physical main memory actually used (16 MB here).
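To see both numbers side by side for the bot's process, something like this works on CentOS (ps reports RSS and VSZ in kilobytes):
ps -o pid,rss,vsz,comm -p 13614
The VSZ column should correspond to the ~144m top reports, while RSS stays near the 16m that is actually resident.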
I have set -Xmx to 1.3GB in the Java VM parameters and my Eclipse does not allow more than this. When running the application I got the below exception:
Exception in thread "Thread-3" java.lang.OutOfMemoryError: Java heap space
What can I do?
You can set the maximum memory eclipse uses with -mx1300m or the like. This limitation will be because you are running 32-bit java on Windows. On a 64-bit OS, you won't have this problem.
However, it's the maximum memory size you set for each application you run from Eclipse which matters. What have you set in your run options in Eclipse?
Your question is very unclear:
Are you running the application in a new JVM?
Did you set the -Xmx / -Xms parameters in the launcher for the child JVM?
If the answer to either of those questions is "no", then try doing ... both. (In particular, if you don't set at least -Xmx for the child JVM, you'll get the default heap size which is relatively small.)
If the answer to both of those questions is "yes", then the problem is that you are running into the limits of your hardware and/or operating system configuration:
On a typical 32-bit Windows, a user process can only address a total of 2**31 bytes of virtual memory, and some of that will be used by the JVM binaries, native libraries and various non-heap memory allocations. (On a 32-bit Linux, I believe you can have up to 2**31 + 2**30.) The "fix" for this is to use a 64-bit OS and a 64-bit JVM.
In addition, a JVM is limited on the amount of memory that it can request by the resources of the OS'es virtual memory subsystem. This is typically bounded by the sum of the available RAM and the size of the disc files / partitions used for paging. The "fix" for this is to increase the size of the paging file / partition. Adding more RAM would probably be a good idea too.
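If the application is launched from Eclipse, note that the heap options go into the application's launch configuration rather than eclipse.ini; as a rough example, with sizes that are only placeholders for whatever your machine can actually back:
-Xms512m -Xmx1024m
Enter that under Run > Run Configurations... > Arguments > VM arguments for the launch configuration in question.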
You may want to look at the AggressiveHeap option: http://java.sun.com/docs/hotspot/gc1.4.2/#4.2.2.%20AggressiveHeap|outline
It solved a similar issue for me.