I am writing a client-side Swing application (a graphical font designer) on Java 5. Recently I have been running into java.lang.OutOfMemoryError: Java heap space because I have not been conservative with memory usage. The user can open an unlimited number of files, and the program keeps the opened objects in memory. After some quick research I found Ergonomics in the 5.0 Java Virtual Machine and other sources saying that on a Windows machine the JVM defaults the max heap size to 64 MB.
Given this situation, how should I deal with this constraint?
I could increase the max heap size using a command-line option to java, but that would require figuring out the available RAM and writing some launcher program or script. Besides, increasing the max to some finite value does not ultimately get rid of the issue.
I could rewrite some of my code to persist objects to the file system frequently (using a database is the same thing) to free up memory. It could work, but it is probably a lot of work too.
If you could point me to details of the above ideas or to some alternatives, like automatic virtual memory or extending the heap size dynamically, that would be great.
Ultimately you always have a finite maximum heap to use no matter what platform you are running on. On 32-bit Windows this is around 2 GB (not specifically heap, but the total amount of memory per process). It just happens that Java chooses to make the default smaller (presumably so that programmers can't create programs with runaway memory allocation without running into this problem and having to examine exactly what they are doing).
Given this, there are several approaches you could take, either to determine how much memory you need or to reduce the amount of memory you are using. One common mistake with garbage-collected languages such as Java or C# is to keep references to objects you are no longer using, or to allocate many objects when you could reuse them instead. As long as objects have a reference to them, they will continue to use heap space, as the garbage collector will not delete them.
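For example, a collection that is filled but never pruned pins everything it holds; this minimal sketch (the class and field names are made up for illustration) runs out of heap even though none of the arrays are ever used again:
import java.util.ArrayList;
import java.util.List;

public class LeakByReference {
    // A static collection is reachable for the life of the program,
    // so everything added here can never be garbage collected.
    private static final List<byte[]> openedFiles = new ArrayList<byte[]>();

    public static void main(String[] args) {
        while (true) {
            openedFiles.add(new byte[1024 * 1024]); // 1 MB per "file"
            // Eventually: java.lang.OutOfMemoryError: Java heap space.
            // The fix is to remove entries when the user closes a file.
        }
    }
}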
In this case you can use a Java memory profiler to determine which methods in your program are allocating large numbers of objects, and then determine whether there is a way to make sure they are no longer referenced, or to avoid allocating them in the first place. One option I have used in the past is "JMP" http://www.khelekore.org/jmp/.
If you determine that you are allocating these objects for a reason and you need to keep the references around (depending on what you are doing, this might be the case), then you will just need to increase the max heap size when you start the program. However, once you have done the memory profiling and understand how your objects are getting allocated, you should have a better idea of how much memory you need.
In general, if you can't guarantee that your program will run in some finite amount of memory (perhaps depending on input size), you will always run into this problem. Only after exhausting all of this will you need to look into caching objects out to disk, etc. At this point you should have a very good reason to say "I need X GB of memory" for something, and you can't work around it by improving your algorithms or memory allocation patterns. Generally this will only be the case for algorithms operating on large datasets (like a database or some scientific analysis program), and then techniques like caching and memory-mapped IO become useful.
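As an illustration of the memory-mapped IO technique, the standard java.nio API can map a file so the OS pages its contents in and out instead of holding them on the Java heap. A minimal sketch (font.dat is a made-up filename, and the file is assumed to be non-empty):
import java.io.RandomAccessFile;
import java.nio.MappedByteBuffer;
import java.nio.channels.FileChannel;

public class MappedRead {
    public static void main(String[] args) throws Exception {
        RandomAccessFile file = new RandomAccessFile("font.dat", "r");
        try {
            FileChannel channel = file.getChannel();
            // The mapping lives outside the Java heap; the OS pages it on demand.
            MappedByteBuffer buf = channel.map(
                    FileChannel.MapMode.READ_ONLY, 0, channel.size());
            System.out.println("first byte: " + buf.get(0));
        } finally {
            file.close();
        }
    }
}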
Run Java with the command-line option -Xmx, which sets the maximum size of the heap.
See here for details.
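For example, a quick way to confirm what maximum the JVM actually picked up (HeapCheck is a hypothetical class name):
// HeapCheck.java -- print the max heap the JVM was started with.
// Launch with: java -Xmx512m HeapCheck
public class HeapCheck {
    public static void main(String[] args) {
        long maxBytes = Runtime.getRuntime().maxMemory();
        System.out.println("Max heap: " + (maxBytes / (1024 * 1024)) + " MB");
    }
}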
You can specify per project how much heap space your project wants.
The following is for Eclipse Helios/Juno/Kepler:
Right-click on
Run As - Run Configurations - Arguments - VM arguments,
then add this
-Xmx2048m
Increasing the heap size is not a "fix", it is a "plaster", and 100% temporary. It will just crash again somewhere else. To avoid these issues, write memory-efficient code:
Use local variables wherever possible.
Make sure you select the correct object type (e.g., choosing between String, StringBuffer and StringBuilder; see the sketch after this list).
Use a sensible design for your program (e.g., deciding between static and non-static variables).
Review anything else along these lines that could apply to your code.
Try moving to multithreading.
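To illustrate the String vs. StringBuilder point: concatenating in a loop with + creates a new intermediate String on every pass, while a StringBuilder reuses one growing buffer. A minimal sketch:
public class ConcatDemo {
    public static void main(String[] args) {
        // Allocates ~10,000 throwaway String objects:
        String slow = "";
        for (int i = 0; i < 10000; i++) {
            slow = slow + i;
        }

        // Reuses a single internal buffer instead:
        StringBuilder fast = new StringBuilder();
        for (int i = 0; i < 10000; i++) {
            fast.append(i);
        }
        System.out.println(slow.length() + " " + fast.length());
    }
}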
Big caveat: at my office, we were finding that (on some Windows machines) we could not allocate more than 512m for the Java heap. This turned out to be due to the Kaspersky anti-virus product installed on some of those machines. After uninstalling that AV product, we found we could allocate at least 1.6 GB, i.e., -Xmx1600m works (the m is mandatory, otherwise it leads to another error, "Too small initial heap").
No idea if this happens with other AV products, but presumably it does because the AV program reserves a small block of memory in every address space, thereby preventing a single really large allocation.
I would like to add the recommendations from Oracle's troubleshooting article.
Exception in thread thread_name: java.lang.OutOfMemoryError: Java heap space
The detail message Java heap space indicates that an object could not be allocated in the Java heap. This error does not necessarily imply a memory leak.
Possible causes:
Simple configuration issue, where the specified heap size is insufficient for the application.
Application is unintentionally holding references to objects, and this prevents the objects from being garbage collected.
Excessive use of finalizers.
One other potential source of this error arises with applications that make excessive use of finalizers. If a class has a finalize method, then objects of that type do not have their space reclaimed at garbage collection time.
After garbage collection, the objects are queued for finalization, which occurs at a later time. Finalizers are executed by a daemon thread that services the finalization queue. If the finalizer thread cannot keep up with the finalization queue, then the Java heap could fill up and this type of OutOfMemoryError exception would be thrown.
One scenario that can cause this situation is when an application creates high-priority threads that cause the finalization queue to increase at a rate that is faster than the rate at which the finalizer thread is servicing that queue.
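A minimal sketch of that failure mode (the class name, payload size, and sleep time are made up for illustration):
public class FinalizerDemo {
    static class LeakyResource {
        private final byte[] payload = new byte[1024 * 1024]; // 1 MB each
        @Override
        protected void finalize() throws Throwable {
            Thread.sleep(100); // a slow finalizer lets the queue grow faster than it drains
            super.finalize();
        }
    }

    public static void main(String[] args) {
        // Run with a small heap (e.g. java -Xmx64m FinalizerDemo). Even though no
        // reference to any LeakyResource is kept, each one must wait in the
        // finalization queue before its payload can be reclaimed, so the heap
        // eventually fills: java.lang.OutOfMemoryError: Java heap space.
        while (true) {
            new LeakyResource();
        }
    }
}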
VM arguments worked for me in Eclipse. If you are using Eclipse 3.4, do the following:
Go to Run --> Run Configurations --> select the project under Maven Build --> select the "JRE" tab --> enter -Xmx1024m.
Alternatively, you can do Run --> Run Configurations --> select the "JRE" tab --> enter -Xmx1024m.
This should increase the memory heap for all builds/projects. The above memory size is 1 GB. You can tune it as you see fit.
Yes, with -Xmx you can configure more memory for your JVM.
To be sure that you don't leak or waste memory, take a heap dump and use the Eclipse Memory Analyzer to analyze your memory consumption.
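If the error is hard to reproduce under a profiler, you can also have the JVM write the dump for you at the moment of failure; both of these are standard HotSpot flags (the path is just an example):
-XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=/tmp/dump.hprof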
Follow the steps below:
Open catalina.sh from tomcat/bin.
Change JAVA_OPTS to
JAVA_OPTS="-Djava.awt.headless=true -Dfile.encoding=UTF-8 -server -Xms1536m
-Xmx1536m -XX:NewSize=256m -XX:MaxNewSize=256m -XX:PermSize=256m
-XX:MaxPermSize=256m -XX:+DisableExplicitGC"
Restart your Tomcat.
By default, for development, the JVM uses a small heap and conservative settings for other performance-related features. For production you can tune these (application-server-specific configuration may exist in addition). Note: if there still isn't enough memory to satisfy a request and the heap has already reached its maximum size, an OutOfMemoryError will occur. The main options:
-Xms<size> set initial Java heap size
-Xmx<size> set maximum Java heap size
-Xss<size> set java thread stack size
-XX:ParallelGCThreads=8
-XX:+CMSClassUnloadingEnabled
-XX:InitiatingHeapOccupancyPercent=70
-XX:+UnlockDiagnosticVMOptions
-XX:+UseConcMarkSweepGC
-Xms512m
-Xmx8192m
-XX:MaxPermSize=256m (not needed as of Java 8, which removed the permanent generation)
For example, preferable settings for production mode on a Linux platform:
After downloading and configuring the server as described at http://www.ehowstuff.com/how-to-install-and-setup-apache-tomcat-8-on-centos-7-1-rhel-7/:
1. Create a setenv.sh file in the folder /opt/tomcat/bin/:
touch /opt/tomcat/bin/setenv.sh
2. Open it and add these parameters to set the preferred mode:
nano /opt/tomcat/bin/setenv.sh
export CATALINA_OPTS="$CATALINA_OPTS -XX:ParallelGCThreads=8"
export CATALINA_OPTS="$CATALINA_OPTS -XX:+CMSClassUnloadingEnabled"
export CATALINA_OPTS="$CATALINA_OPTS -XX:InitiatingHeapOccupancyPercent=70"
export CATALINA_OPTS="$CATALINA_OPTS -XX:+UnlockDiagnosticVMOptions"
export CATALINA_OPTS="$CATALINA_OPTS -XX:+UseConcMarkSweepGC"
export CATALINA_OPTS="$CATALINA_OPTS -Xms512m"
export CATALINA_OPTS="$CATALINA_OPTS -Xmx8192m"
export CATALINA_OPTS="$CATALINA_OPTS -XX:MaxMetaspaceSize=256M"
3. Restart Tomcat: service tomcat restart
Note that the JVM uses more memory than just the heap. For example, Java methods, thread stacks and native handles are allocated in memory separate from the heap, as are JVM internal data structures.
I read somewhere else that you can try-catch java.lang.OutOfMemoryError, and in the catch block free all resources that you know might use a lot of memory, close connections and so forth, then do a System.gc() and retry whatever you were going to do.
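A minimal sketch of that pattern (loadFont and releaseCaches are hypothetical stand-ins for your own logic):
public class RetryOnOom {
    public static void main(String[] args) {
        try {
            loadFont("Foo.ttf");
        } catch (OutOfMemoryError e) {
            releaseCaches();     // free everything you can first
            System.gc();         // request a collection before retrying
            loadFont("Foo.ttf"); // retry once
        }
    }

    // Hypothetical stand-ins for the application's own logic.
    static void loadFont(String path) { /* allocate and parse the file */ }
    static void releaseCaches()       { /* drop references to cached objects */ }
}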
Another way is the following, although I don't know whether it works; I am currently testing it in my application.
The idea is to trigger garbage collection by calling System.gc(), which can increase free memory (it is only a hint to the JVM, not a guarantee). You can keep checking this after memory-gobbling code executes.
// Minimum acceptable free memory you think your app needs
long minRunningMemory = 1024 * 1024;
Runtime runtime = Runtime.getRuntime();
if (runtime.freeMemory() < minRunningMemory) {
    System.gc(); // ask the JVM to collect garbage before continuing
}
An easy way to solve an OutOfMemoryError in Java is to increase the maximum heap size using the JVM option -Xmx512M; this will often immediately make the error go away. This is my preferred solution when I get an OutOfMemoryError in Eclipse, Maven or Ant while building a project, because depending on the size of the project you can easily run out of memory.
Here is an example of increasing the maximum heap size of the JVM. It is also better to keep the -Xms to -Xmx ratio at either 1:1 or 1:1.5 if you are setting the heap size in your Java application.
export JVM_ARGS="-Xms1024m -Xmx1024m"
Reference Link
If you came here searching for this issue from React Native, then try this:
cd android/ && ./gradlew clean && cd ..
Then add this line to your gradle.properties file:
org.gradle.jvmargs=-Xmx2048m -XX:MaxPermSize=512m -XX:+HeapDumpOnOutOfMemoryError -Dfile.encoding=UTF-8
It should work. You can adjust MaxPermSize accordingly to fix your heap problem.
I faced the same problem with the Java heap size.
I have two solutions if you are using Java 5 (1.5):
1. Just install JDK 1.6, go to the preferences of Eclipse, and set the JRE path to the Java 1.6 you installed.
2. Check your VM arguments and leave them as they are; just add one line below all the arguments present in VM arguments:
-Xms512m -Xmx512m -XX:MaxPermSize=192m
I think it will work.
If you need to monitor your memory usage at runtime, the java.lang.management package offers MBeans that can be used to monitor the memory pools in your VM (e.g., eden space, tenured generation, etc.), as well as garbage collection behaviour.
The free heap space reported by these MBeans will vary greatly depending on GC behaviour, particularly if your application generates a lot of objects which are later GC-ed. One possible approach is to monitor the free heap space after each full GC, which you may be able to use to make a decision about freeing up memory by persisting objects.
Ultimately, your best bet is to limit your memory retention as far as possible whilst performance remains acceptable. As a previous comment noted, memory is always limited, but your app should have a strategy for dealing with memory exhaustion.
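A minimal sketch of reading those beans (the class name is illustrative):
import java.lang.management.ManagementFactory;
import java.lang.management.MemoryMXBean;
import java.lang.management.MemoryPoolMXBean;
import java.lang.management.MemoryUsage;

public class HeapMonitor {
    public static void main(String[] args) {
        MemoryMXBean memoryBean = ManagementFactory.getMemoryMXBean();
        MemoryUsage heap = memoryBean.getHeapMemoryUsage();
        System.out.println("Heap used: " + heap.getUsed() / (1024 * 1024)
                + " MB of " + heap.getMax() / (1024 * 1024) + " MB");

        // Per-pool breakdown (eden, survivor, tenured, ...; names vary by GC).
        for (MemoryPoolMXBean pool : ManagementFactory.getMemoryPoolMXBeans()) {
            System.out.println(pool.getName() + ": " + pool.getUsage());
        }
    }
}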
In Android Studio, add/change this line at the end of gradle.properties (Global Properties):
...
org.gradle.jvmargs=-XX\:MaxHeapSize\=1024m -Xmx1024m
If it doesn't work, you can retry with a heap size bigger than 1024.
Add the code below inside android/gradle.properties:
org.gradle.jvmargs=-Xmx4096m -XX:MaxPermSize=4096m -XX:+HeapDumpOnOutOfMemoryError
org.gradle.daemon=true
org.gradle.parallel=true
org.gradle.configureondemand=true
Note that if you need this in a deployment situation, consider using Java Web Start (with an "ondisk" version, not the network one - possible in Java 6u10 and later), as it allows you to specify the various arguments to the JVM in a cross-platform way.
Otherwise you will need an operating-system-specific launcher which sets the arguments you need.
In my case it was solved by assigning more memory to the Shared build process heap size in the IntelliJ settings.
Go to IntelliJ Settings > Compiler > Shared build process heap size.
Regarding NetBeans, you can set the max heap size to solve the problem.
Go to 'Run', then --> 'Set Project Configuration' --> 'Customise' --> 'Run' of its popped-up window --> 'VM Options' --> fill in '-Xms2048m -Xmx2048m'.
If you are using Android Studio, just add these lines to the gradle.properties file:
org.gradle.jvmargs=-Xmx2048m -XX:MaxPermSize=512m -XX:+HeapDumpOnOutOfMemoryError -Dfile.encoding=UTF-8
In Android Studio:
File -> Invalidate Caches and Restart solved it for me :)
If this issue is happening in WildFly 8 with JDK 1.8, then we need to specify the MaxMetaspace setting instead of the PermGen settings.
For example, we need to add the configuration below to WildFly's setenv.sh file:
JAVA_OPTS="$JAVA_OPTS -XX:MaxMetaspaceSize=256M"
For more information, please check Wildfly Heap Issue
If you keep allocating and keeping references to objects, you will fill up any amount of memory you have.
One option is to do a transparent file close & open when the user switches tabs (you keep only a pointer to the file, and when the user switches tabs you close & clean all the objects; it will make the tab change slower... but...), and perhaps keep only 3 or 4 files in memory.
Another thing you could do is, when the user opens a file, load it and intercept any OutOfMemoryError, then (as it is not possible to open the file) close that file, clean up its objects and warn the user that they should close unused files.
Your idea of dynamically extending virtual memory doesn't solve the issue, for the machine is limited in resources, so you should be careful and handle memory issues (or at least be careful with them).
A couple of hints I've seen for memory leaks:
--> Keep in mind that if you put something into a collection and afterwards forget about it, you still have a strong reference to it, so null out the collection, clear it, or do something with it... otherwise you will find a memory leak that is difficult to track down.
--> Using collections with weak references (WeakHashMap...) can help with memory issues, but you must be careful with them, for you might find that the object you look for has already been collected (see the sketch after this list).
--> Another idea I've found is to develop a persistent collection that stores the least-used objects in a database and loads them transparently. This would probably be the best approach...
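A minimal sketch of the weak-reference caveat: entries vanish once the key is no longer strongly referenced.
import java.util.Map;
import java.util.WeakHashMap;

public class WeakCacheDemo {
    public static void main(String[] args) {
        Map<Object, byte[]> cache = new WeakHashMap<Object, byte[]>();
        Object key = new Object();
        cache.put(key, new byte[1024 * 1024]);
        System.out.println("before GC: " + cache.size()); // 1

        key = null;   // drop the only strong reference to the key
        System.gc();  // request a collection (only a hint, not guaranteed)
        System.out.println("after GC:  " + cache.size()); // usually 0
    }
}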
A Java OOM heap space issue can also arise when your DB connection pool is full.
I faced this issue because my Hikari connection pool (after upgrading to Spring Boot 2.4.*) was full and could not provide connections anymore (all active connections were still pending, fetching results from the database).
The issue was that some of our native queries in JPA repositories contained ORDER BY ?#{#pageable}, which took a very long time to return results after the upgrade.
I removed ORDER BY ?#{#pageable} from all the native queries in the JPA repositories, and the OOM heap space issue, along with the connection pool issue, was resolved.
If this error occurs right after the execution of your JUnit tests, then you should execute Build -> Rebuild Project.
If this error comes up during APK generation in React Native, cd into the android folder in your project and run:
./gradlew clean
then
./gradlew assembleRelease
If the error persists, restart your machine.
In IntelliJ, it worked for me just by running "Build Project".
If everything else fails, in addition to increasing the max heap size, try also increasing the swap size. For Linux, as of now, relevant instructions can be found at https://linuxize.com/post/create-a-linux-swap-file/.
This can help if you're, for example, compiling something big on an embedded platform.
I am new to Java.
We are using a tool which uses the Java heap.
Recently we have been facing "java.lang.OutOfMemoryError: Java heap space" and the tool is hanging.
Now I plan to write a script to get the heap size, and if it exceeds a limit, trigger GC so the heap usage will be low again.
Please help me with this.
Machine OS: Windows 2012 Server
OK. Two things.
If you get an OutOfMemoryError, garbage collection will not help. OutOfMemoryErrors are:
Thrown when the Java Virtual Machine cannot allocate an object because it is out of memory, and no more memory could be made available by the garbage collector. OutOfMemoryError objects may be constructed by the virtual machine as if suppression were disabled and/or the stack trace was not writable. Source
There are a few JMX implementations for Perl. Here is the first result off Google.
However, this will not help, because the problem is not what you think it is. The problem is either:
The program requires more heap space than is currently available. You need to allocate more heap space for the JVM.
The application is leaking memory (well, technically objects, but the result is the same). You need to hire better programmers.
I have a project I'm writing (in Java) for a class where the prof says we're not allowed to use more than 200m.
I limit the heap memory to 50m (just to be absolutely sure) with -Xmx50m, but according to top, it's still using 300m.
I tried running the Eclipse Memory Analyzer and it reports only 26m.
Could this all be memory on the stack? I'm pretty sure I never go further than about 300 method calls deep (yes, it is a recursive DFS search), so that would mean every stack frame is using up almost a megabyte, which seems hard to believe.
The program is single-threaded. Does anyone know any other places where I might reduce memory usage? Also, how can I check/limit how much memory the stack is using?
UPDATE: I'm using the following JVM options now with no effect (still about 300m according to top): -Xss104k -Xms40m -Xmx40m -XX:MaxPermSize=1k
Another UPDATE: Actually, if I let it run a little bit longer (with all these options), about half the time it suddenly drops to 150m after 4 or 5 seconds (the other half it doesn't drop). What makes this really strange is that my program has nothing stochastic about it (and, as I said, it's single-threaded), so there's no reason it should behave differently on different runs.
Could it have something to do with the JVM I'm using?
java version "1.6.0_27"
OpenJDK Runtime Environment (IcedTea6 1.12.3) (6b27-1.12.3-0ubuntu1~10.04)
OpenJDK 64-Bit Server VM (build 20.0-b12, mixed mode)
According to java -h, the default JVM is -server. I tried adding -cacao and now (with all the other options) it's only 59m. So I suppose this solves my problem. Can anyone explain why this was necessary? Also, are there any drawbacks I should know about?
One more update: cacao is really, really slow compared to server. It is an awful option.
The top command reflects the total amount of memory used by the Java application. This includes, among other things:
A basic memory overhead of the JVM itself
The heap space (bounded with -Xmx)
The permanent generation space (-XX:MaxPermSize - not standard in all JVMs)
Thread stack space (-Xss per stack), which may grow significantly depending on the number of threads
Space used by native allocations (using the ByteBuffer class, or JNI)
Max memory = [-Xmx] + [-XX:MaxPermSize] + number_of_threads * [-Xss]
where -Xmx is the max heap memory, -Xms the initial heap memory, -Xss the per-thread stack memory, and -XX:MaxPermSize the permanent generation size.
The following example illustrates this situation. I have launched my tomcat with the following startup parameters:
-Xmx168m -Xms168m -XX:PermSize=32m -XX:MaxPermSize=32m -Xss1m
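Plugging those flags into the formula above: if that Tomcat runs, say, 50 threads (a number picked purely for illustration), the expected ceiling is roughly 168m (heap) + 32m (PermGen) + 50 x 1m (stacks) = 250m, before adding the JVM's own base overhead and any native allocations.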
With -Xmx you are configuring the heap size. To configure the stack size use the -Xss parameter. The sum of those two parameters should be approximately what you want:
-Xmx150m -Xss50m
for example.
Additionally, there is also the -XX:MaxPermSize parameter, which controls the size of the permanent generation. For -client it has a default value of 32mb, and for -server, 64mb. According to your configuration, calculate it as well. PermGen space is:
The permanent generation is used to hold reflective data of the VM itself such as class objects and method objects.
So basically it stores the internal data of the JVM, like class definitions and interned strings.
In the end I must say that there is one part you can't control: the memory used by the native Java process itself. Java is a program, just like any other, so it uses memory too. If you are watching memory usage in the Task Manager you will see this memory as well, together with your program's memory consumption.
It's important to note that "total memory used" (RSS in Linux land) includes the JDK heap (+ other JDK areas) as well as any "native memory" allocated.
For instance, these people found that allocating too many JAXBContexts (which have associated native memory) between GCs could cause the process to use a lot of extra RAM. Another common one is apparently ZipInflater if you don't call close on it (or GZipStream, etc.):
http://sleeplessinslc.blogspot.com/2014/08/jvm-native-memory-leak.html
His final workaround/fix was to either GC "more often" (by using the G1 garbage collector, or by specifying a smaller [ironically] -Xmx setting) or to cache the JAXBContext objects (since they have no close method, you can't control the leak otherwise).
Also note that sometimes you can find memory culprits just by examining jstack output: http://javaeesupportpatterns.blogspot.com/2011/09/jaxbcontext-performance-problem-case.html
It's also sometimes possible to accidentally "miss" closing, for instance, GZipStreams: http://kohsuke.org/2011/11/03/quiz-time-memory-leak-in-java
Have you tried using JVisualVM?
http://docs.oracle.com/javase/6/docs/technotes/tools/share/jvisualvm.html
I've often found it helps me track this stuff down. It will show you how much of each kind of memory is being used, and even lets you drill in and find out what is using it.