How much memory should I use in my Java program?

I am making a Java program.
It involves creating an image with a size of up to 9933 * 14043 pixels (which is A0 size at 300 ppi). The image is 24-bit, so it would take up about 400 MB of space. The BufferedImage class somehow takes more RAM than the bitmap's actual size, so the image would consume about 600 MB of RAM.
With other data, the app would take at most about 700 MB of RAM when rendering the large image. I haven't had any problem with it so far. However, if the end user doesn't have enough free RAM, the JVM will not be able to allocate the memory for the bitmap and will throw an OutOfMemoryError.
So what should I do?
I came up with something:
Catch the error and show a prompt to the user (sketched below, after this list of options).
Wait some time until there's enough memory. If the waiting lasts too long, show a prompt.
Write my own bitmap class, and write the image part by part with a FileOutputStream. The .bmp format is not terribly complicated. (Actually, I have already written and optimized most of it.) By rendering the bitmap in parts, the whole image doesn't have to stay in RAM, and the size of the parts can be changed dynamically according to the available memory. However, this is kind of reinventing the wheel and takes a significant amount of work. Also, the parts of the image that involve text must be rendered into a BufferedImage and then converted to my class (because I don't want to dig into the TrueType font format). Anyway, if the Java BufferedImage class works in my case, I won't go down this path.
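A minimal sketch of the first option, assuming a Swing UI (the class and method names are mine, and the dialog text is an assumption):

import java.awt.image.BufferedImage;
import javax.swing.JOptionPane;

// Sketch: try the big allocation and prompt the user if it fails.
public class AllocateOrPrompt {
    static BufferedImage allocate(int width, int height) {
        try {
            return new BufferedImage(width, height, BufferedImage.TYPE_INT_RGB);
        } catch (OutOfMemoryError e) {
            JOptionPane.showMessageDialog(null,
                    "Not enough memory to render the image.",
                    "Error", JOptionPane.ERROR_MESSAGE);
            return null;
        }
    }

    public static void main(String[] args) {
        BufferedImage img = allocate(9933, 14043); // A0 at 300 ppi, from the question
        System.out.println(img == null ? "allocation failed" : "allocated");
    }
}

Catching OutOfMemoryError is normally discouraged, but it is reasonably safe here because the failure is a single, known allocation rather than a sign of a corrupted program state.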

I doubt that anyone has less than a gig of RAM nowadays. So you can check whether the user has enough memory with Runtime.getRuntime().maxMemory(), and if they don't, just show an error and close. Here's an example that uses JOptionPane in the case of an error:
long memory = Runtime.getRuntime().maxMemory(); // in bytes
long required = 700 * 1024 * 1024; // 700 MB, in bytes
if (memory < required) {
    JOptionPane.showMessageDialog(null, "You don't have enough memory. (700MB required)", "Error", JOptionPane.ERROR_MESSAGE);
    System.exit(0);
}
maxMemory() returns the maximum amount of memory the JVM will attempt to use (in bytes).

Related

High ram usage when loading image with BufferedImage

I have created a program that loads an image with FileDialog, resizes it, previews it to the user, and after a button click saves it to a folder.
My problem is:
when I run my program - RAM usage ~50 MB
loading a 1 MB JPG file - RAM usage ~93 MB
saving a 1 MB JPG file - RAM usage ~160 MB
I intend this program to be lightweight, but after 3-4 files it occupies 500 MB of RAM.
I tried to use System.gc(); every time the user saves a file, but it reduced RAM usage only by ~10%.
Below is the code for loading and saving images; the full code you can find HERE.
BTW - why does the file size increase to 10 MB after loading a 1 MB JPG and then saving it?
Code for loading image:
FileDialog imageFinder = new FileDialog((Frame)null, "Select file to open:");
imageFinder.setFile("*.jpg; *.png; *.gif; *.jpeg");
imageFinder.setMode(FileDialog.LOAD);
imageFinder.setVisible(true);
userImagePath = new File(imageFinder.getDirectory()).getAbsolutePath()+"\\"+imageFinder.getFile();
userImagePath = userImagePath.replace("\\", "/");
Code for saving image:
BufferedImage bimage = new BufferedImage(userImage.getWidth(null), userImage.getHeight(null), BufferedImage.TYPE_INT_ARGB);
Graphics2D bGr = bimage.createGraphics();
bGr.drawImage(userImage, 0, 0, null);
bGr.dispose();
try {
    BufferedImage bi = bimage;
    File outputfile = new File("C:\\Users\\Mariola\\git\\MySQL-viwer\\MySQL viewer\\src\\database_images\\"+userBreedInfo[0]+".jpg");
    ImageIO.write(bi, "png", outputfile);
} catch (IOException e1) {
}
System.gc();
The "problem" is that ImageIO kind of uses much memory. Then this memory will not be returned to the OS (that's why even a no-need call to System.gc() will not return it) because that's how JVM works.(Java 13 promises that memory will be returned to the OS?) As #Andrew Thompson pointed out in the comment section, if you want less memory consumption take a look at this question. If you run it you will see that with memory limit it will not consume so much. Which actually tells you not to worry about it. JVM will do its magic and will handle memory consumption according to how much memory the OS says its free.
If it still bothers you, you could try to find any ImageIO alternatives that may behave differently. In my opinion though, it does not worth it for your needs. I mean, you just want to save/load an image.
Another worth-to-read question is Why is it bad practice to call System.gc()?
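If you want to see how much of the heap your program is actually using (as opposed to how much the JVM has reserved, which is roughly what the OS task manager shows), a small check with the standard Runtime methods can help; a sketch:

// Sketch: report how much of the JVM heap is actually in use.
// The process size reported by the OS also includes heap the JVM has reserved but is not using.
Runtime rt = Runtime.getRuntime();
long usedMb = (rt.totalMemory() - rt.freeMemory()) / (1024 * 1024);
long maxMb = rt.maxMemory() / (1024 * 1024);
System.out.println("Heap in use: " + usedMb + " MB of " + maxMb + " MB max");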

Android: Memory friendly modification of image bytes

I'm currently making an Android App that modifies some bytes of an image. For this, I've written this code:
Bitmap bmp = BitmapFactory.decodeStream(new FileInputStream(path));
// getByteCount() sizes the buffer for the full pixel data (width * height * bytes per pixel)
ByteBuffer buffer = ByteBuffer.allocate(bmp.getByteCount());
bmp.copyPixelsToBuffer(buffer);
return buffer.array();
The problem is that this approach uses too much heap memory and throws an OutOfMemoryError.
I know that I can make the heap memory for the App bigger, but it doesn't seem like a good design choice.
Is there a more memory-friendly way of changing bytes of an image?
It looks like there are two copies of the pixel data on the managed heap:
The uncompressed data in the Bitmap
The copy of the data in the ByteBuffer
The memory requirement could be halved by leaving the data in the Bitmap and using getPixel() / setPixel() (or perhaps editing a row at a time with the "bulk" variants), but that adds some overhead.
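A sketch of the in-place, row-at-a-time approach, assuming the bitmap is decoded as mutable (the edit applied to each pixel here is just an illustration):

// Decode as a mutable bitmap so it can be edited in place (no second full copy of the pixels).
BitmapFactory.Options opts = new BitmapFactory.Options();
opts.inMutable = true; // setPixels() requires a mutable bitmap
Bitmap bmp = BitmapFactory.decodeStream(new FileInputStream(path), null, opts);

int w = bmp.getWidth();
int[] row = new int[w]; // one row of packed ARGB pixels
for (int y = 0; y < bmp.getHeight(); y++) {
    bmp.getPixels(row, 0, w, 0, y, w, 1);   // read one row
    for (int x = 0; x < w; x++) {
        row[x] ^= 0x00FFFFFF;               // example edit: invert RGB, keep alpha
    }
    bmp.setPixels(row, 0, w, 0, y, w, 1);   // write the row back
}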
Depending on the nature of the image, you may be able to use a less precise format (e.g. RGB 565 instead of 8888), halving the memory requirement.
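For example, decoding straight to RGB 565 might look like the sketch below; whether the quality loss is acceptable depends on the image:

// Decode at 2 bytes per pixel instead of 4, halving the in-memory size.
BitmapFactory.Options opts565 = new BitmapFactory.Options();
opts565.inPreferredConfig = Bitmap.Config.RGB_565;
Bitmap bmp565 = BitmapFactory.decodeStream(new FileInputStream(path), null, opts565);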
As noted in one of the comments, you could uncompress the data to a file, memory-map it with java.nio.channels.FileChannel#map(), and access it through a MappedByteBuffer. This adds a fair bit of overhead to loading and saving, and may be annoying since you have to work through a ByteBuffer rather than a byte[].
Another option is expanding the heap with android:largeHeap (documented here), though in some respects you're just postponing the inevitable: you may be asked to edit an image that is too large for the "large" heap. Also, the capacity of a "large" heap varies from device to device, just as the "normal-sized" heap does. Whether or not this makes sense depends in part on how large the images you're loading are.
Before you do any of this I'd recommend using the heap analysis tools (see e.g. this blog post) to see where your memory is going. Also, look at the logcat above the out-of-memory exception; it should identify the size of the allocation that failed. Make sure it looks "reasonable", i.e. you're not inadvertently allocating significantly more than you think you are.

Large Array - Out of Memory Error

I am writing an Android application for a single specific phone (which just needs to run on my personal phone) and I need to make use of the following large array, which I want to use to efficiently convert RGB colors to HSV:
RainbowTable = new float[256*256*256][3];
The total size of this array should be 256*256*256*3*4 B = 201326592 B = 192 MB.
When I debug the app, I get an out-of-memory exception, although approximately 300 MB of RAM is still free before its execution, according to the Android task manager.
I have already set the large-heap-option to true in the manifest file.
What can I do to prevent this error and preserve the needed amount of RAM?
EDIT: My phone is rooted, so maybe there is a possibility to increase the size of the memory heap per application.
Each device has a maximum per-app RAM cap. If this attribute in the manifest does not alleviate your problem:
android:largeHeap="true"
Then your only other option is to write your code using the NDK. But that's a pretty hefty thing to dive into, so I would try to figure out an alternative first.
The maximum heap size is device-dependent, and 192 MB is likely to be over the limit allowed by devices at the moment.
However, this answer indicates that you can use the NDK to allocate more memory.
If you have already tried largeHeap=true, I doubt there is a working solution; normally the size of a single app's heap can be at most around 24 - 48 MB, depending on the device.
You can use a direct ByteBuffer, which is allocated outside the Java heap:
ByteBuffer byteBuffer = ByteBuffer.allocateDirect(bigSize);
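A sketch of that idea, keeping the whole table in one direct (off-heap) buffer instead of a float[][] (the class and method names are made up; whether a 192 MB native allocation succeeds still depends on the device):

import java.nio.ByteBuffer;
import java.nio.FloatBuffer;

// Sketch: the 256*256*256 x 3 float table stored outside the Java heap.
public class DirectRainbowTable {
    private static final int ENTRIES = 256 * 256 * 256;
    // 3 floats (H, S, V) per entry, 4 bytes each, allocated natively
    private final FloatBuffer table =
            ByteBuffer.allocateDirect(ENTRIES * 3 * 4).asFloatBuffer();

    public void put(int rgbIndex, float h, float s, float v) {
        int base = rgbIndex * 3;
        table.put(base, h);
        table.put(base + 1, s);
        table.put(base + 2, v);
    }

    public float hue(int rgbIndex) {
        return table.get(rgbIndex * 3);
    }
}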
Yes, this memory issue happens when an app uses a lot of memory.
Try calling System.gc(); this explicitly requests a garbage collection.
Every app in Android gets only a limited amount of heap memory, about 16 MB on older devices.
So try this.

Java compress/decompress large files (>1gb)

I have made an application in Android that lets the user compress and decompress files, and I used the package java.util.zip. Everything is okay: the speed is fine, and files are fully compressed and decompressed together with their directories. The only problem is that the application is not able to compress/decompress large files (greater than 1 GB).
I believe the problem is the size of my buffer. In other code that I've seen, the buffer size is 1024 or 2048 or 8192, but my buffer size is based on the size of the chosen file (just to make it flexible). But once the user chooses a large file (with a size of more than 8 digits), that's where the error comes up. I searched over the net and also here on this site, but I can't find an answer. My problem is similar to this:
To Compress a big file in a ZIP with Java
Thanks in advance for the help! :)
EDIT:
Thanks for the comments and answers. They really helped a lot. I thought BUFFER in compressing/decompressing in Java meant the size of the file, so in my program I made the buffer size flexible (buffer size = file size). Will someone please explain how the buffer works so I can understand why it is okay for BUFFER to have a fixed value? Also, help me figure out why other people say it is much better if the buffer size is 8 KB or so. Thanks a lot! :)
If you size the buffer to the size of the file, then you will get an OutOfMemoryError whenever the file is too big for the available memory.
Use a normal buffer size and let it do its work - buffering the data in a streaming fashion, one chunk at a time, rather than all in one go.
For explanation, see for example the documentation of BufferedOutputStream:
The class implements a buffered output stream. By setting up such an output stream, an application can write bytes to the underlying output stream without necessarily causing a call to the underlying system for each byte written.
So using a buffer is more efficient than non-buffered writing.
And from the write method:
Ordinarily this method stores bytes from the given array into this stream's buffer, flushing the buffer to the underlying output stream as needed. If the requested length is at least as large as this stream's buffer, however, then this method will flush the buffer and write the bytes directly to the underlying output stream.
Each write adds data to the in-memory buffer until the buffer is full; when it is full, it is flushed and cleared. If you use a very large buffer, a large amount of data is held in memory before each flush, and if your buffer is the same size as the input file, you are saying you need to read the whole content into memory before flushing it. Using the default buffer size is usually just fine: there will be more physical writes (flushes), but you avoid exhausting memory.
By allowing you to specify a specific buffer size, the API is letting you choose the right balance between memory consumption and i/o to suit your application. If you tune your application for performance, you might end up tweaking buffer size. But the default size will be reasonable for many situations.
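A minimal sketch of streaming compression with a fixed-size buffer (the file names here are placeholders):

import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.IOException;
import java.util.zip.ZipEntry;
import java.util.zip.ZipOutputStream;

// Sketch: compress one file with a fixed 8 KB buffer, streaming chunk by chunk
// instead of reading the whole file into memory.
public class ZipOneFile {
    public static void main(String[] args) throws IOException {
        byte[] buffer = new byte[8192]; // fixed size, independent of the file size
        try (FileInputStream in = new FileInputStream("big.dat");
             ZipOutputStream zipOut = new ZipOutputStream(new FileOutputStream("big.zip"))) {
            zipOut.putNextEntry(new ZipEntry("big.dat"));
            int read;
            while ((read = in.read(buffer)) != -1) {
                zipOut.write(buffer, 0, read); // only 8 KB is ever held in the buffer
            }
            zipOut.closeEntry();
        }
    }
}

The same pattern works in reverse with ZipInputStream for decompression.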
It sounds like it would help to simply set a maximum size for the buffer, something like:
//After calculating the buffer size bufSize:
bufSize = Math.min(bufSize, MAXSIZE);

Out of memory on resource loading for Android game

I'm writing a game that so far has to work with about ~200 PNGs totaling ~14 MB, all with sizes ranging from 250x150 to 600x400 (these are xhdpi drawables, right now I'm letting android do the resizing for lower dpi).
The problem is that if I load them all when the game starts, I get an out of memory error after about 20 seconds, both on the emulator (with 1 GB RAM) and on a Galaxy S device (512 MB RAM).
I've seen games with much bigger resources, some even in the hundreds of MB. How do these handle loading their resources fast without exceeding the memory constraints? 14 MB shouldn't be that much.
Right now I'm using BitmapFactory.decodeResource() to load every bitmap inside a few for loops before the game starts. I also tried using a BitmapFactory.Options with inSampleSize set to 4, which fixes the out of memory problem, but not the slow loading, which still takes about 20 seconds. I would also rather not have to do this, because if I make the images 4 times smaller there is little point in accounting for hdpi and xhdpi screens at all - quality will be a lot worse on these. I also tried using getDrawable(), but it made no difference.
I also considered loading each resource as it is needed (after the game has started), but:
Wouldn't this be slow? - right now the system allocates more memory pretty much between each load, which takes 100-300 ms. This would seriously slow down my frame rate.
If I cache each image so it's only loaded once, eventually I will need all of them in the same level, so this shouldn't fix the out of memory error.
I know a bitmap takes more space in memory than the PNG on the disk - I get the error when allocating about 40 MB. Isn't there a way to just load the 14 MB in memory and build bitmaps from them as they are needed, or something that gives up a little speed (it's a pretty basic 2d game so I don't really need a lot of processing power or max fps) in exchange for a lot more free memory?
Also, I have a lot of small animations with about 10-16 frames. Each one is a separate file. Would there be any benefit in merging them in one single file and loading that? I can't see it helping with memory usage, but could it help with loading times?
Update: I changed my code to load each frame, display it, and then recycle() it. As far as I can tell this leads to less than 5 FPS, so I don't think I can do this.
Try expanding your option (2) with a SoftReference cache of Bitmaps. That way, you would load your Bitmap only once, but if the VM is low on memory this would be the first candidate to be freed. Something like this:
import java.lang.ref.SoftReference;
import java.util.HashMap;
import android.graphics.Bitmap;

public class MemoryCache {
    private HashMap<String, SoftReference<Bitmap>> cache = new HashMap<String, SoftReference<Bitmap>>();

    public Bitmap get(String id) {
        if (!cache.containsKey(id))
            return null;
        SoftReference<Bitmap> ref = cache.get(id);
        return ref.get(); // null if the GC has already reclaimed the Bitmap
    }

    public void put(String id, Bitmap bitmap) {
        cache.put(id, new SoftReference<Bitmap>(bitmap));
    }

    public void clear() {
        cache.clear();
    }
}
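A hypothetical usage sketch (the helper method name is mine), decoding a resource only when the cache has no live entry for it:

// Go through the cache; decode only on a miss (never loaded, or reclaimed by the GC).
private final MemoryCache cache = new MemoryCache();

private Bitmap loadBitmap(Resources res, int resId) {
    String key = String.valueOf(resId);
    Bitmap bmp = cache.get(key);
    if (bmp == null) {
        bmp = BitmapFactory.decodeResource(res, resId);
        cache.put(key, bmp);
    }
    return bmp;
}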
