I've been using Java for a while now, and my usual ritual when setting up a new dev machine includes downloading and installing the latest JDK from Oracle's site.
This prompted an unusual question today: does it matter whether I use the 32-bit or 64-bit JRE bundle?
Thinking back on it, I've installed both versions before, and my normal toolchain (Eclipse) plugs in happily. In my day-to-day programming, I don't recall ever having to change anything or think about anything differently just because I was using the 64-bit JRE (or targeting the 64-bit JRE, for that matter).
From my understanding of 64-bit vs. 32-bit, it really boils down to how numbers are stored under the covers... and I do know that an int is 32 bits and a long is 64 bits, same with a float being 32 bits and a double being 64 bits. So is it just that Java has abstracted even this subtlety away, and has perhaps been "64-bit compatible" all along?
I'm sure I'm missing something here, besides not being able to install a 64-bit JRE onto a 32-bit system.
64-bit vs. 32-bit really boils down to the size of object references, not the size of numbers.
In 32-bit mode, references are four bytes, allowing the JVM to uniquely address 2^32 bytes of memory. This is the reason 32-bit JVMs are limited to a maximum heap size of 4GB (in reality, the limit is smaller due to other JVM and OS overhead, and differs depending on the OS).
In 64-bit mode, references are (surprise) eight bytes, allowing the JVM to uniquely address 2^64 bytes of memory, which should be enough for anybody. JVM heap sizes (specified with -Xmx) in 64-bit mode can be huge.
But 64-bit mode comes with a cost: references are double the size, increasing memory consumption. This is why Oracle introduced "compressed oops". With compressed oops enabled (which I believe is now the default), object references are shrunk to four bytes, with the caveat that the heap is limited to four billion objects (and a 32 GB -Xmx). Compressed oops are not free: there is a small computational cost to achieve this big reduction in memory consumption.
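If you want to verify what your own JVM is doing, here is a minimal sketch, assuming a HotSpot-based JVM, that reads the effective value of the UseCompressedOops flag (HotSpotDiagnosticMXBean is HotSpot-specific, not part of the standard API):

    import com.sun.management.HotSpotDiagnosticMXBean;
    import java.lang.management.ManagementFactory;

    public class CompressedOopsCheck {
        public static void main(String[] args) {
            // HotSpot-specific MXBean; not available on every JVM.
            HotSpotDiagnosticMXBean bean =
                    ManagementFactory.getPlatformMXBean(HotSpotDiagnosticMXBean.class);
            // Reports the effective value, including the JVM's ergonomic default.
            System.out.println("UseCompressedOops = "
                    + bean.getVMOption("UseCompressedOops").getValue());
        }
    }

Note that on a 32-bit HotSpot JVM this flag does not exist at all, so getVMOption will throw; you may want to guard that call accordingly.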
As a personal preference, I always run the 64-bit JVM at home. The CPU is x64 capable, the OS is too, so I like the JVM to run in 64-bit mode as well.
As you note, primitive numeric types in Java are well-defined.
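For instance, a quick sketch that prints the primitive widths, which the language specification fixes regardless of the JVM's bitness:

    public class PrimitiveWidths {
        public static void main(String[] args) {
            // These constants are defined by the Java language, not the
            // platform: the output is identical on 32-bit and 64-bit JVMs.
            System.out.println("int:    " + Integer.SIZE + " bits");
            System.out.println("long:   " + Long.SIZE + " bits");
            System.out.println("float:  " + Float.SIZE + " bits");
            System.out.println("double: " + Double.SIZE + " bits");
        }
    }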
However, the choice between 32-bit and 64-bit JVMs can matter if your Java application is using native-code libraries, which may be built for use in a 32-bit application, a 64-bit application, or both.
If you have native libraries that support only 32-bit applications, you either need to use a 32-bit JVM, or build 64-bit versions of the libraries.
For local development I will always use a 64-bit JDK, primarily because I'm likely to need the whole memory space for builds and the IDE.
That being said, for deployment to production I would recommend 32-bit if possible. Why?
For some Java EE servers that are licensed for production use, the cost depends on factors like which machine you run on and how many cores it has. For WebSphere Liberty Profile specifically, you are also limited to 2 GB.
A 64-bit JRE takes up somewhat more memory, so if you're trying to constrain it to something like 2 GB (or better yet, a 2x 1 GB cluster), the 32-bit version leaves you more flex space to work in without paying a cent.
From https://plumbr.eu/blog/java/should-i-use-32-or-64-bit-jvm
Problem 1: 30-50% more heap is required on 64-bit. Why so? Mainly because of the memory layout in 64-bit architecture. First of all, object headers are 12 bytes on a 64-bit JVM. Secondly, object references can be either 4 bytes or 8 bytes, depending on JVM flags and the size of the heap. This definitely adds some overhead compared to the 8-byte headers and 4-byte references on 32-bit. You can also dig into one of our earlier posts for more information about calculating the memory consumption of an object.
Problem 2: Longer garbage collection pauses. Building up more heap means there is more work to be done by the GC while cleaning it of unused objects. In real life this means you have to be extra cautious when building heaps larger than 12-16 GB. Without fine-tuning and measuring, you can easily introduce full GC pauses spanning several minutes. In applications where latency is not crucial and you can optimize for throughput only, this might be OK, but in most cases it becomes a showstopper.
To limit the impact on your Java EE environment, offload parts of it to other services: Elasticsearch for search, Hazelcast for caching, your database for data storage. Keep your Java EE server hosting your application core itself rather than running those services inside it.
I think there are two main differences to consider. One has been mentioned here but not the other.
On the one hand, as others have mentioned, memory and data types. 32-bit and 64-bit JVMs use different native data-type sizes and memory-address spaces.
A 64-bit JVM can allocate (can use) more memory than a 32-bit one.
64-bit JVMs use native data types with more capacity, but these occupy more space. Because of that, the same object may occupy more space too.
For JVMs whose garbage collector (GC) pauses the application, the 64-bit versions may be slower, because the GC must scan bigger heaps/objects, and that takes more time.
There is an IBM presentation explaining these differences.
And on the other hand, the supported native libraries. Java programs that use JNI to access native libraries require different versions depending on the type of JVM.
32-bit JVMs use 32-bit native libraries, and 64-bit JVMs use 64-bit libraries.
That means that if your program uses libraries relying on native code, such as SWT, you will need different versions of them. Note that the SWT download page offers different versions for 32- and 64-bit Linux/Windows, and that there are different versions of Eclipse (each with a different version of SWT) for 32- and 64-bit.
Some applications, such as Alloy, are packaged with 32-bit native libraries and fail on 64-bit JVMs. You can solve these problems simply by downloading the corresponding 64-bit native libraries and configuring JNI appropriately.
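One way to wire that up at runtime is to pick the library file that matches the running JVM. A sketch, with hypothetical file paths, and with the caveat that os.arch values (and the contains("64") heuristic) vary by vendor and platform:

    public class NativeLoader {
        public static void main(String[] args) {
            // Typically "amd64"/"x86_64" on 64-bit JVMs, "x86"/"i386" on
            // 32-bit ones; exact values vary by vendor and OS.
            String arch = System.getProperty("os.arch");
            boolean is64Bit = arch.contains("64");

            // Hypothetical paths: load the build of the native library
            // that matches the JVM's bitness, not the OS's.
            String path = is64Bit
                    ? "/opt/myapp/lib64/libnative.so"
                    : "/opt/myapp/lib32/libnative.so";
            System.load(path); // requires an absolute path
        }
    }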
Do I need to understand the difference between 32-bit JVM and 64-bit JVM?
If you aren't building a performance-critical application, you don't have to understand the difference. The subtle differences between a 32-bit JVM and a 64-bit JVM won't make much difference to your application, and you can skip reading further.
Does 64-bit JVM perform better than 32-bit JVM?
Most of us assume that since 64-bit is bigger than 32-bit, 64-bit JVM performance will be better than 32-bit JVM performance. Unfortunately, that's not the case: a 64-bit JVM can show a small performance degradation compared to a 32-bit JVM. Below is an excerpt from the Oracle JDK documentation regarding 64-bit JVM performance:
“Generally, the benefits of being able to address larger amounts of memory come with a small performance loss in 64-bit VMs versus running the same application on a 32-bit VM.
The performance difference comparing an application running on a 64-bit platform versus a 32-bit platform on SPARC is on the order of 10-20% degradation when you move to a 64-bit VM. On AMD64 and EM64T platforms this difference ranges from 0-15% depending on the amount of pointer accessing your application performs.”
What are the things to consider when migrating from 32-bit JVM to 64-bit JVM?
a. GC Pause times
The primary reason to migrate from a 32-bit JVM to a 64-bit JVM is to attain a larger heap size (i.e., -Xmx). When you increase the heap size, your GC pause times will automatically start to climb, because there is now more garbage in memory to clean up. You need to do proper GC tuning before the migration; otherwise your application can experience pauses of several seconds to a few minutes.
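A simple way to quantify this before and after the migration is to read the cumulative GC statistics every JVM exposes through the standard management API. A minimal sketch (the collector names in the output vary with the GC you've chosen):

    import java.lang.management.GarbageCollectorMXBean;
    import java.lang.management.ManagementFactory;

    public class GcPauseReport {
        public static void main(String[] args) {
            // One bean per collector (e.g. young and old generation);
            // the names differ between Parallel, CMS, G1, etc.
            for (GarbageCollectorMXBean gc :
                    ManagementFactory.getGarbageCollectorMXBeans()) {
                System.out.printf("%s: %d collections, %d ms total%n",
                        gc.getName(),
                        gc.getCollectionCount(),
                        gc.getCollectionTime());
            }
        }
    }

Run it (or poll it periodically) under both JVMs with the same workload to compare GC time directly.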
b. Native Library
If your application uses the Java Native Interface (JNI) to access native libraries, then you need to upgrade those native libraries as well, because a 32-bit JVM can use only 32-bit native libraries and, similarly, a 64-bit JVM can use only 64-bit native libraries.
I had memory problems with one server. It's an Amazon micro instance, so its memory is very limited (free -m says 603 MB). That's why I started Tomcat with
-server -Xmx290m -Xms290m -XX:MaxPermSize=65m
However, the "java" process takes around 86% of the total memory, which is 518M. 518-355 = 163 MB overhead. That looks like a lot, and is suspicious, especially given than:
a similar application ran on another jvm version on another micro instance doesn't have overhead this big
the same application run locally gives just 40 MB overhead. Locally it runs in Windows 7, 64 bit.
The java version on the problematic server is:
java version "1.7.0_09-icedtea"
OpenJDK Runtime Environment (amzn-2.3.3.13.amzn1-x86_64)
OpenJDK 64-Bit Server VM (build 23.2-b09, mixed mode)
The big discrepancy between the local runtime and the one on the server makes me rule out expensive off-heap objects (e.g. byte buffers) in the application (and I'm not using any of those anyway). I know that JVM overhead varies, but having more than half of the heap as overhead sounds too big. So what could be the reason? Or is this just the normal way of things?
The choice of GC may impact heap-size overhead, since each GC scheme must set aside some memory to manage your heap. Also, on such a small instance you may not benefit much from going 64-bit: a 32-bit JVM will take up less heap, even compared to a 64-bit JVM using compressed oops (which should be on by default).
So play with your favourite garbage collectors and pick the one that gives the best mix of overhead and latency for you.
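To see where the memory is actually going, it can help to split heap from non-heap usage via the standard MemoryMXBean. Non-heap here covers JVM-managed pools like PermGen and the code cache; thread stacks and native allocations show up in neither number. A minimal sketch:

    import java.lang.management.ManagementFactory;
    import java.lang.management.MemoryMXBean;

    public class MemoryReport {
        public static void main(String[] args) {
            MemoryMXBean mem = ManagementFactory.getMemoryMXBean();
            // Heap is bounded by -Xmx; non-heap (PermGen/Metaspace, code
            // cache) is part of the "overhead" the OS sees on top of it.
            System.out.println("Heap:     " + mem.getHeapMemoryUsage());
            System.out.println("Non-heap: " + mem.getNonHeapMemoryUsage());
        }
    }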
As far as I understand, I can develop 32-bit or 64-bit applications from both 32-bit and 64-bit Eclipse. Is this correct? If so, what are the benefits of running 64-bit Eclipse on a 64-bit JRE?
I will mostly use the Python plugin PyDev to develop Python applications.
The benefits are an increase in available heap/memory.
The downside to the 64-bit JVM is a decrease in performance:
The performance difference comparing an application running on a 64-bit platform versus a 32-bit platform on SPARC is on the order of 10-20% degradation when you move to a 64-bit VM.
This hit to performance is apparently due to the frequent use of 64-bit addresses for managing object references: moving twice as much data as with 32-bit addresses.
Frankly, if you're not having problems with the restricted memory space, then you likely don't need to worry about it. You can go far with 32-bit JVM/Eclipse.
The 64-bit version of any software is superior to the 32-bit version in terms of memory addressing (and instruction set). This applies to Java, and therefore to Eclipse.
To run 64-bit Eclipse you will need a 64-bit JDK.
Finally, a 32-bit JDK cannot have more than roughly 1.5 GB of heap space via the JVM argument -Xmx, whereas 64-bit supports much higher values.
When writing programs, how do I make sure that a particular application is a 64-bit application or a 32-bit application?
Is it possible to hard-code that the app will be 64-bit or 32-bit, and if so, would the 64-bit app necessarily take more memory than the usual 32-bit app?
I am aware of the memory limitations of 32-bit applications/hardware.
Java is platform-independent, so you shouldn't have to worry about it. The only difference between 32-bit and 64-bit is the JDK used:
If you use a 64-bit JDK, your app is 64-bit compatible.
If you use a 32-bit JDK, your app is 32-bit compatible.
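If you want to check at runtime which kind of JVM your application actually ended up on, a small sketch using standard system properties (sun.arch.data.model is HotSpot-specific; os.arch is more portable but its values vary by vendor):

    public class JvmBitness {
        public static void main(String[] args) {
            // HotSpot reports "32" or "64"; other JVMs may not set this.
            System.out.println("Data model: "
                    + System.getProperty("sun.arch.data.model", "unknown"));
            // Portable but vendor-dependent values: "x86", "i386",
            // "amd64", "x86_64", ...
            System.out.println("os.arch:    " + System.getProperty("os.arch"));
        }
    }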
If I compile a program on a 32-bit JDK and run it on a 64-bit JVM, will it work?
Yes it will, because you are not compiling to machine code; you are only creating bytecode.
The 32-bit/64-bit question only comes into play when the JVM loads your classes, and it is the JVM itself (not your program) that is a 32-bit or 64-bit build.
There is one bytecode, regardless of whether it will be used on 32-bit or 64-bit.
This means you can use libraries that were compiled on 32-bit machines, before there was any 64-bit JVM, on a 64-bit JVM.
64-bit JVMs can use more memory, but not as much more as you might think, because modern 64-bit JVMs use compressed oops, letting them use 32-bit references for heaps of up to 32 GB. (You can use more off-heap memory as well.)
Many 32-bit JVMs on 32-bit OSes are limited to 1.2 to 1.5 GB of memory. On a 64-bit OS this limit might be 2.5 to 3.5 GB, depending on the OS. A 64-bit JVM on a 64-bit OS is practically limited to around 1 TB of memory, though this limit could be lifted in the future (and depends on the OS).
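To see the heap ceiling your particular JVM/OS combination actually grants, a one-class sketch (Runtime.maxMemory reports the -Xmx-bounded maximum the JVM will attempt to use):

    public class MaxHeap {
        public static void main(String[] args) {
            // On a 32-bit JVM this tops out well below 4 GB; on a
            // 64-bit JVM it follows whatever -Xmx you pass.
            long max = Runtime.getRuntime().maxMemory();
            System.out.printf("Max heap: %.1f MB%n", max / (1024.0 * 1024.0));
        }
    }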
The only difference you can have is if you use JNI. A JNI shared library depends on being either a 32-bit or a 64-bit library, and you might only have one (in which case you can only load it on a JVM of the same bitness), or it might behave differently.
Java bytecode is platform-independent. Unless you start calling into JNI or something like that, you should not need to worry.
You can tell the JVM whether to run in 32-bit or 64-bit mode (historically via the -d32 and -d64 launcher flags, where both variants are installed).
There is no such thing as a 32-bit or 64-bit Java program.
It's the JVM itself that is built for different architectures; the Java specification is constant and does not change with the underlying hardware.
After creating your Java application, build it and get the final .jar file (if the application includes third-party jars, get those jar files as well).
Then use the exe4j application, and you can easily create your own .exe file targeting x64 or x32.
You can download exe4j from this link.
We are developing a Swing application written in Java which requires only about 128 MB of memory, and I don't foresee it requiring much more (like 4 GB) in the near future. Previously we always provided three different releases: one for 32-bit Windows, one for 32-bit Linux and another for 64-bit Linux, with an installer that includes a JRE.
The 64-bit version was not used by anyone until a couple of weeks ago, when an OutOfMemoryError was reported because the application consumes about 40-50% more memory than the 32-bit version.
My question is: do we need to provide the 64-bit version for 64-bit Linux at all if the application will never need more than 4 GB of memory? A quick test revealed that the 32-bit version also works on 64-bit Linux, but I'm not sure what the cons would be, e.g. performance and/or compatibility issues.
If your application provides no improvements on 64-bit host operating systems, and those hosts can run your 32-bit releases, then I see no immediate need to provide it.
However, most, if not all, new systems are based on the x64 architecture, where I'd advocate that 64-bit software should be the natural default. This need grows stronger the closer you get to the hardware level; I can't tell you how awkward it is to run a virtual operating system just to support some 32-bit VPN client.
Promoting the 64-bit client as the preferred choice would probably shift your download statistics.
Most 32-bit JVMs are limited to around 1.2-1.5 GB.
If you find your application uses much more memory with the 64-bit JVM, try -XX:+UseCompressedOops, which tells the 64-bit JVM to use 32-bit references while still being able to address up to 32 GB of heap.
My question is, is it needed at all for us to provide the 64-bit version for 64-bit Linux if the application will never need to use more than 4GB memory?
If the application won't ever need that much memory, a 64-bit installer/JVM adds no value. On the contrary, it is a poor option because (as you observed) it simply uses more memory and (probably) runs slower as a result.
(Actually, the real limit will be less than 4 GB: some parts of the 32-bit address space will be unusable because of hardware architecture issues.)
I suggest you withdraw the 64-bit version, but give the user the ability to use a JVM that they have downloaded and installed separately. (Indeed, you should probably do the latter anyway; embedded copies of the JRE tend to get overlooked when people upgrade to get the latest JVM security fixes...)
UPDATE (2019): Java 8 is the last version of Java that Oracle provides for 32-bit platforms. As of Java 11 (the current LTS version), the Oracle-badged distributions for Linux, macOS, SunOS/SPARC and Windows are 64-bit only.
My advice now would be to migrate your product offerings away from 32 bit as soon as you can. You don't want to be stuck trying to support a product on an EOL'd version of Java.
Granted, 32-bit Java 11 is available from Azul, according to this Q&A. (I noted 32-bit "Zulu" releases of Java 11 for Linux, but not for Windows or macOS. YMMV.)
My Java program needs lots of memory to run. The 32-bit version of Java maxes out at 1.5 GB, and my system has 4 GB of RAM, so I decided to run it under the 64-bit version of Java. But JDIC won't work there, which affects my program, so I wonder if anyone knows when a 64-bit JDIC will be available?
Assuming you are running a 64-bit OS and a 64-bit JVM, you could try the latest version, Java 6u13, as I believe it has improved client-side support for 64-bit applications.
Note: I wouldn't suggest using more than 1/2 to 2/3 of your memory for Java's heap, as you need to leave memory for the OS, other programs, and Java's own shared memory/libraries. You may also need more main memory to see a real improvement in the amount of memory you can actually use.
I only used Desktop (org.jdesktop.jdic.desktop.Desktop) from JDIC, and now I can use java.awt.Desktop from Java 6. There are a few differences, but essentially they both have the same functionality.