I am writing an Android application for a single specific phone (it only needs to run on my personal phone), and I need to use the following large array to convert RGB colors to HSV efficiently:
float[][] rainbowTable = new float[256 * 256 * 256][3];
The total size of this array should be 256*256*256*3*4 B = 201,326,592 B = 192 MB.
When I debug the app, I get an out-of-memory exception, although approximately 300 MB of RAM are still free before it runs, according to the Android task manager.
I have already set the largeHeap option to true in the manifest file.
What can I do to prevent this error and reserve the needed amount of RAM?
EDIT: My phone is rooted, so perhaps there is a way to increase the per-application heap size.
Each device has a maximum per-app RAM cap. If this attribute in the manifest does not alleviate your problem:
android:largeHeap="true"
then your only other option is to write your code using the NDK. But that's a pretty hefty thing to dive into, so I would try to find an alternative first.
The maximum heap size is device dependent and 192MB is likely to be over the limit allowed by devices at the moment.
However, this answer indicates that you can use the NDK to allocate more memory.
If you have already tried largeHeap="true", I doubt there is a working solution; normally the maximum size of a single memory heap is about 24-48 MB, depending on the device.
You can use a direct ByteBuffer, which is backed by native memory rather than the Java heap (a plain ByteBuffer.allocate() would still count against the heap limit):
ByteBuffer byteBuffer = ByteBuffer.allocateDirect(bigSize);
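For the table in the question, the data could live in one big direct buffer instead of 16 million small float[3] arrays (each of which carries its own object header). A minimal sketch, assuming the device will actually supply ~192 MB of native memory; the class name and index helper are illustrative:

import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.FloatBuffer;

public class RainbowTableOffHeap {
    // 256*256*256 colors x 3 floats (H, S, V) x 4 bytes = 192 MB,
    // allocated outside the Java heap.
    private final FloatBuffer table = ByteBuffer
            .allocateDirect(256 * 256 * 256 * 3 * 4)
            .order(ByteOrder.nativeOrder())
            .asFloatBuffer();

    private static int index(int r, int g, int b) {
        return ((r << 16) | (g << 8) | b) * 3; // flatten [r][g][b] to one offset
    }

    public void put(int r, int g, int b, float h, float s, float v) {
        int i = index(r, g, b);
        table.put(i, h);
        table.put(i + 1, s);
        table.put(i + 2, v);
    }

    public float[] get(int r, int g, int b) {
        int i = index(r, g, b);
        return new float[] { table.get(i), table.get(i + 1), table.get(i + 2) };
    }
}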
Yes, this memory issue happens when an app uses a lot of memory.
Try calling System.gc(); this explicitly requests a garbage collection.
Every app in Android gets a limited amount of memory; on some devices it is as little as 16 MB.
So try this.
Related
In the past I have written most of my memory- and performance-critical applications in C++ or C#, but with the latest improvements in the Java language I figured I might give it a try. However, I am already stuck pretty early on with memory management in Java. More specifically, the following two points really surprise me:
Why do I have to tell the JVM how much memory it can use? Couldn't it just use whatever it wants? I mean... take whatever you need?
Why is it so greedy with memory? Can't it be a bit more generous in giving back memory to the OS? Check the following example:
The example I mentioned above:
Started with 2048mb of heap space
At time T0 the application uses 300mb of RAM
I open a 400mb file and fully read it into a byte array -> 700mb of RAM
I wrap that array into a ByteBuffer -> 700mb of RAM
I decrypt it into a second array (can't be done in place) -> 1100mb of RAM
I close the file and clear the ByteBuffer and set the encrypted array to null -> 1100mb of RAM
I parse the decrypted array (which produces a little bit less data) -> 1350mb of RAM
I set the decrypted array to null -> 1350mb of RAM
I wait for a while -> 1350mb of RAM
If I repeat the above with 1024mb of heap space -> OutOfMemoryException
So my question is: Why is Java behaving that way? And more importantly, can I tell the JVM to be a bit more... sane? C# is also a managed language, but it manages to properly free unused memory.
Why do I have to tell the JVM how much memory it can use?
You don't. It has a reasonable default maximum of 1/4 of main memory up to 32 GB.
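You can check the limit the JVM actually picked at runtime; a minimal sketch:

public class MaxMemory {
    public static void main(String[] args) {
        // maxMemory() reports the most the JVM will attempt to use,
        // i.e. -Xmx or the platform default.
        long maxBytes = Runtime.getRuntime().maxMemory();
        System.out.println("Max heap: " + (maxBytes >> 20) + " MB");
    }
}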
Couldn't it just use whatever it wants?
It could; you could set the maximum to be all your free memory.
I mean... take whatever you need?
It does this, but up to some maximum you set so it doesn't impact other applications.
Why is it so greedy with memory?
It depends on how you use it.
Can't it be a bit more generous in giving back memory to the OS?
That depends on which GC you use.
Started with 2048mb of heap space
I assume you mean 2 GB or 2048 MB ("mb" is a milli-bit, a pet hate of mine, sorry).
This is a really small amount of memory these days; I assume this is just an example. Cf. my 9-year-old has an old desktop of mine with 24 GB of memory.
I open a 400mb file and fully read it into a byte array -> 700mb of RAM
I would memory-map the file. This uses almost no heap. By the way, you can memory-map files in C and C# too; this is not a trick specific to Java.
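A minimal sketch of memory-mapping a file in Java (the file name is illustrative):

import java.io.IOException;
import java.nio.MappedByteBuffer;
import java.nio.channels.FileChannel;
import java.nio.file.Paths;
import java.nio.file.StandardOpenOption;

public class MapFile {
    public static void main(String[] args) throws IOException {
        try (FileChannel channel = FileChannel.open(
                Paths.get("data.bin"), StandardOpenOption.READ)) {
            // The mapping lives in virtual memory outside the Java heap;
            // the OS pages it in lazily as you read.
            MappedByteBuffer mapped = channel.map(
                    FileChannel.MapMode.READ_ONLY, 0, channel.size());
            System.out.println("first byte: " + mapped.get(0));
        }
    }
}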
I wrap that array into a ByteBuffer -> 700mb of RAM
A memory-mapped file is already a ByteBuffer. At this point your heap is no bigger.
I decrypt it into a second array (can't be done in place) -> 1100mb of RAM
I would decrypt into another "direct" buffer. Again, no more heap has been used.
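A direct buffer is a one-liner; only the small wrapper object lands on the heap. A sketch (the size here is illustrative):

import java.nio.ByteBuffer;

public class DirectBufferDemo {
    public static void main(String[] args) {
        // The 64 MB backing store is native memory, not Java heap.
        ByteBuffer decrypted = ByteBuffer.allocateDirect(64 * 1024 * 1024);
        decrypted.put(0, (byte) 42);
        System.out.println("capacity: " + decrypted.capacity());
    }
}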
I parse the decrypted array (which produces a little bit less data) -> 1350mb of RAM
So this uses an extra 250 MB.
I set the decrypted array to null -> 1350mb of RAM
This changes the reference to null but nothing else; it is just one of the buffers above in any case.
I wait for a while -> 1350mb of RAM
If you do nothing, you wouldn't expect anything to happen.
If I repeat the above with 1024mb of heap space -> OutOfMemoryException
This is because you are retaining memory, and you have less available the second time around.
In short, I would:
use memory-mapped files;
use native ("direct") buffers;
note that you can free these deterministically if you really need to, but usually you don't;
leave the maximum heap size alone unless you really need to change it.
And more importantly, can I tell the JVM to be a bit more... sane?
I suspect "sane" is in the eye of the beholder.
BACKGROUND
I recently wrote a Java application that consumes a specified amount of MB. I am doing this purposefully to see how another Java application reacts to specific RAM loads (I am sure there are tools for this purpose, but this was the fastest). The memory-consumer app is very simple: I enter the number of MB I want to consume, and it creates a vector with that many bytes. It also has a reset button that removes the elements of the vector and prompts for a new number of bytes.
QUESTION
I noticed that the heap size of the Java process never shrinks once the vector is cleared. I tried clear(), but the heap remains the same size. It seems like the heap grows with the elements, but even after the elements are removed the size stays the same. Is there a way in Java code to reduce the heap size? Is there a detail about the Java heap that I am missing? This feels like an important question, because if I wanted to keep a low memory footprint in any Java application, I would need a way to keep the heap from growing, or at least from staying large for long periods of time.
Try triggering garbage collection by calling System.gc().
This might help you - When does System.gc() do anything
Calling the GC extensively is not recommended.
You should set the maximum heap size with the -Xmx option and watch your app's memory allocation. Also, use weak references for objects that have a short lifecycle; the GC will remove them automatically.
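A minimal sketch of the weak-reference suggestion (the class name is illustrative, and System.gc() is only a hint, so the printed result can vary between JVMs):

import java.lang.ref.WeakReference;

public class WeakRefDemo {
    public static void main(String[] args) {
        byte[] payload = new byte[10 * 1024 * 1024]; // 10 MB
        WeakReference<byte[]> ref = new WeakReference<>(payload);

        payload = null; // drop the only strong reference
        System.gc();    // request (not force) a collection

        // Once collected, the weakly referenced array is reclaimed.
        System.out.println("still reachable? " + (ref.get() != null));
    }
}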
I sometimes write Python programs for which it is very difficult to determine in advance how much memory they will use. In those cases I sometimes invoke a Python program that tries to allocate massive amounts of RAM, causing the kernel to swap heavily and degrade the performance of other running processes.
Because of this, I wish to restrict how much memory a Python heap can grow. When the limit is reached, the program can simply crash. What's the best way to do this?
If it matters, much of the code is written in Cython, so the solution should take into account memory allocated there. I am not married to a pure-Python solution (it does not need to be portable), so anything that works on Linux is fine.
Check out resource.setrlimit(). It only works on Unix systems but it seems like it might be what you're looking for, as you can choose a maximum heap size for your process and your process's children with the resource.RLIMIT_DATA parameter.
EDIT: Adding an example:
import resource

rsrc = resource.RLIMIT_DATA
soft, hard = resource.getrlimit(rsrc)
print('Soft limit starts as:', soft)

resource.setrlimit(rsrc, (1024, hard))  # limit to one kilobyte

soft, hard = resource.getrlimit(rsrc)
print('Soft limit changed to:', soft)
I'm not sure what your use case is exactly, but it's possible you need to place a limit on the size of the stack instead, with resource.RLIMIT_STACK. Going past this limit will send a SIGSEGV signal to your process, and to handle it you will need to employ an alternate signal stack, as described in the setrlimit Linux manpage. I'm not sure whether sigaltstack is exposed in Python, though, so it could prove difficult to recover if you go over this boundary.
Have a look at ulimit. It allows resource quotas to be set. It may need appropriate kernel settings as well.
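For example, in bash (the script name is illustrative; ulimit -v takes a value in kilobytes):

ulimit -v 1048576    # cap the shell's virtual memory at ~1 GB
python myscript.py   # allocations past the cap will fail (MemoryError)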
The following code allocates memory up to a specified maximum resident set size:
import resource

def set_memory_limit(memory_kilobytes):
    # ru_maxrss: peak memory usage (bytes on OS X, kilobytes on Linux)
    usage_kilobytes = lambda: resource.getrusage(resource.RUSAGE_SELF).ru_maxrss
    rlimit_increment = 1024 * 1024
    resource.setrlimit(resource.RLIMIT_DATA, (rlimit_increment, resource.RLIM_INFINITY))

    memory_hog = []

    while usage_kilobytes() < memory_kilobytes:
        try:
            for x in range(100):
                memory_hog.append('x' * 400)
        except MemoryError:
            # Raise the soft limit and keep allocating.
            rlimit = resource.getrlimit(resource.RLIMIT_DATA)[0] + rlimit_increment
            resource.setrlimit(resource.RLIMIT_DATA, (rlimit, resource.RLIM_INFINITY))

set_memory_limit(50 * 1024)  # 50 MB
Tested on a Linux machine.
Is there any Java API available that would help simulate a fixed amount of memory being used?
I am building a dummy application whose methods contain no implementations. All I would like to do within these methods is simulate a certain amount of memory being used up. Is this at all possible?
The simplest way to consume a fixed amount of memory is to create a byte array of that size and retain it.
byte[] bytes = new byte[1000*1000]; // use 1 MB of memory.
This could get tricky with the way Java handles memory, considering applications run inside the runtime environment, and it is not obvious where allocations end up (the heap, etc.).
One simple way might be loading text files of the specific sizes you want into memory, then making sure they don't get garbage collected once the method returns; see the sketch below.
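A minimal sketch of that approach (class and method names are illustrative); holding the arrays in a static list keeps them strongly reachable, so the GC cannot reclaim them after the method returns:

import java.util.ArrayList;
import java.util.List;

public class MemoryHog {
    // Static strong references pin the arrays in memory after consume() returns.
    private static final List<byte[]> RETAINED = new ArrayList<>();

    public static void consume(int megabytes) {
        for (int i = 0; i < megabytes; i++) {
            RETAINED.add(new byte[1024 * 1024]); // 1 MB per chunk
        }
    }

    public static void main(String[] args) {
        consume(100); // simulate roughly 100 MB of live heap
        System.out.println("Retained ~" + RETAINED.size() + " MB");
    }
}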
I've written a simple application that works with a database. My program has a table to show data from the database. When I try to expand the frame, the program fails with an OutOfMemoryError, but if I don't do this, it works well.
I start my program with the -Xmx4m parameter. Does it really need more than 4 megabytes to be in the expanded state?
Another question: when I run Java VisualVM, I see a sawtooth chart of my program's heap usage, while other programs using the Java VM (such as NetBeans) have much flatter charts. Why is my program's heap usage so unstable even when it does nothing (only waiting for the user to push a button)?
You may want to try setting this flag to generate a detailed heap dump that shows you exactly what is going on:
-XX:+HeapDumpOnOutOfMemoryError
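For example (the jar name and dump path are illustrative):

java -Xmx4m -XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=/tmp/app.hprof -jar myapp.jar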
A typical "small" Java desktop application in 2011 is going to run with ~64-128MB. Unless you have a really pressing need, I would start by leaving it set to the default (i.e. no setting).
If you are trying to do something different (e.g. run this on an Android device), you are going to need to get very comfortable with profiling (and you should probably post with that tag).
Keep in mind that your 100-record cache (~12 bytes per record) may be (and probably is) double that if you are storing character data (Java uses UTF-16 internally).
Re: the "instability": the JVM is handling memory usage for you and will perform garbage collection according to whatever algorithms it chooses (these have changed dramatically over the years). The graphing may just be an artifact of the tool and the sample period. Performance in a desktop app is affected by a huge number of factors.
As an example, we once had a huge memory "leak" that only showed up in one automated test and never in normal real-world usage. It turned out the test left the mouse hovering over a tooltip which included the name of the open file, which in turn held a set of references back to the entire (huge) project. Wiggling the mouse a few pixels got rid of the tooltip, which meant the references all cleared up and the garbage collector took out the trash.
Moral of the story? You need to capture the exact heap dump at the time of the out-of-memory error and review it very carefully.
Why would you set your maximum heap size to 4 megabytes? Java is often memory intensive, so setting it at such a ridiculously low level is a recipe for disaster.
It also depends on how many objects are being created and destroyed by your code; the underlying Swing (I am assuming) components use renderer components to draw the elements, and these elements are created and destroyed each time a component is redrawn.
Look at the CellRenderer code and this will show you why objects are created and destroyed so often, and why the garbage collector does such a wonderful job.
Try playing with the -Xmx setting and see how the charts flatten out. I would expect -Xmx64m or -Xmx128m to be suitable (although the amount of data coming out of your database will obviously be an important contributing factor).
You may need more than 4 MB for a GUI with an expanded screen if you are using double buffering. This generates multiple images of the UI so they can be shown on screen quickly. Usually this is done assuming you have lots and lots of memory.
The sawtooth memory allocation is due to something being done and then garbage collected, possibly on a repaint operation or another timer. Is there a timer in your code that checks some process or value for changes? Or have you added code to an object's repaint or another such process?
I think 4 MB is too small for anything except a trivial program; for example, lots of GUI libraries (Swing included) need to allocate temporary working space for graphics that alone may exceed that amount.
If you want to avoid out of memory errors but also want to avoid over-allocating memory to the JVM, I'd recommend setting a large maximum heap size and a small initial heap size.
-Xmx (the maximum heap size) should generally be quite large, e.g. 256 MB.
-Xms (the initial heap size) can be much smaller; 4 MB should work, though remember that if the application needs more than this, there will be a temporary performance hit while the heap is resized.
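For example, a launch command along those lines (the main class name is illustrative):

java -Xms4m -Xmx256m MyApp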