I have created a program that loads an image with FileDialog, resizes it, previews it to the user, and saves it to a folder after a button click.
My problem is:
when I run my program - RAM usage ~50 MB
loading a 1 MB JPG file - RAM usage ~93 MB
saving a 1 MB JPG file - RAM usage ~160 MB
I intend this program to be lightweight, but after 3-4 files it occupies 500 MB of RAM.
I tried calling System.gc() every time the user saves a file, but it reduced RAM usage by only ~10%.
Below is the code for loading and saving images; the full code you can find HERE.
BTW - why does a 1 MB JPG grow to about 10 MB after I load and then save it?
Code for loading image:
// Ask the user to pick an image file
FileDialog imageFinder = new FileDialog((Frame) null, "Select file to open:");
imageFinder.setFile("*.jpg;*.png;*.gif;*.jpeg");
imageFinder.setMode(FileDialog.LOAD);
imageFinder.setVisible(true);
// Build an absolute path to the chosen file and normalize the separators
userImagePath = new File(imageFinder.getDirectory(), imageFinder.getFile()).getAbsolutePath();
userImagePath = userImagePath.replace("\\", "/");
Code for saving image:
// Copy userImage into a BufferedImage so ImageIO can encode it
BufferedImage bimage = new BufferedImage(userImage.getWidth(null), userImage.getHeight(null), BufferedImage.TYPE_INT_ARGB);
Graphics2D bGr = bimage.createGraphics();
bGr.drawImage(userImage, 0, 0, null);
bGr.dispose();
try {
    File outputfile = new File("C:\\Users\\Mariola\\git\\MySQL-viwer\\MySQL viewer\\src\\database_images\\" + userBreedInfo[0] + ".jpg");
    // Note: this writes PNG data into a file with a .jpg extension
    ImageIO.write(bimage, "png", outputfile);
} catch (IOException e1) {
    e1.printStackTrace(); // don't swallow the exception silently
}
System.gc();
The "problem" is that ImageIO uses quite a lot of memory, and that memory is then not returned to the OS (which is why even an unnecessary call to System.gc() will not give it back) - that's simply how the JVM works. (Java 13 promises that unused heap memory will be returned to the OS.) As @Andrew Thompson pointed out in the comment section, if you want less memory consumption, take a look at this question. If you run your program with a memory limit, you will see that it does not consume so much, which really tells you not to worry about it: the JVM will do its magic and manage heap growth according to how much memory the OS says is free.
As for the file size question: your code writes PNG data (a lossless format, here with an alpha channel because of TYPE_INT_ARGB) into a file merely named .jpg, which is why a 1 MB JPEG comes back out at roughly 10 MB.
If it still bothers you, you could try to find ImageIO alternatives that behave differently. In my opinion, though, it is not worth it for your needs - you just want to save and load an image.
Another question worth reading is Why is it bad practice to call System.gc()?
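To see this distinction for yourself, you can compare the heap actually in use against what the JVM has reserved from the OS. Below is a minimal sketch (the class name and the 50 MB dummy allocation are my own, standing in for an image buffer):

```java
// Prints how much of the heap is actually in use before and after a GC request.
// The memory the JVM has *reserved* from the OS (Runtime.totalMemory()) typically
// does not shrink even when the in-use portion drops.
public class HeapStats {
    static long usedHeapBytes() {
        Runtime rt = Runtime.getRuntime();
        return rt.totalMemory() - rt.freeMemory(); // in-use portion of the heap
    }

    public static void main(String[] args) {
        System.out.println("Used before allocation: " + usedHeapBytes());
        byte[] big = new byte[50 * 1024 * 1024]; // simulate a decoded image buffer
        System.out.println("Used after allocation:  " + usedHeapBytes());
        big = null;
        System.gc(); // a request, not a command
        // The in-use number usually drops here, but the process footprint
        // the OS sees generally stays where it was.
        System.out.println("Used after gc():        " + usedHeapBytes());
    }
}
```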
Related
I am making a Java program.
It involves creating an image of up to 9933 x 14043 pixels (A0 size at 300 ppi). The image is 24-bit, so it would take up about 400 MB of space. The BufferedImage class somehow takes more RAM than the bitmap's actual size, so the image would consume about 600 MB of RAM.
With other data, the app would at most take about 700 MB of RAM when rendering the large image. I haven't had any problem with it so far. However, if the end user doesn't have enough free RAM, the JVM will not be able to allocate the memory for the bitmap and will throw an OutOfMemoryError.
So what should I do?
I came up with a few options:
Catch the error and show a prompt to the user.
Wait until there is enough memory. If the wait lasts too long, show a prompt.
Write my own bitmap class and write the image out part by part with a FileOutputStream. The .bmp format is not terribly complicated. (Actually, I have already written and optimized most of it.) By rendering the bitmap in parts, the whole image doesn't have to stay in RAM, and the size of the parts can be adjusted dynamically to the available memory. However, this is kind of reinventing the wheel and takes a significant amount of work. Also, any part of the image that involves text must be drawn into a BufferedImage first and then converted to my class (because I don't want to dig into the actual font formats). Anyway, if the Java BufferedImage class works in my case, I wouldn't go down this road.
I doubt that anyone has less than a gig of RAM nowadays. So you can check whether the user has enough memory with Runtime.getRuntime().maxMemory(), and if they don't, just show an error and exit. Here's an example that shows a JOptionPane error dialog when there is not enough memory:
long memory = Runtime.getRuntime().maxMemory(); // in bytes
long required = 700L * 1024 * 1024; // 700 MB, in bytes
if (memory < required) {
    JOptionPane.showMessageDialog(null, "You don't have enough memory. (700MB required)", "Error", JOptionPane.ERROR_MESSAGE);
    System.exit(0);
}
maxMemory() returns the maximum amount of memory the JVM will attempt to use (in bytes).
I'm currently trying to figure out where my application has a memory leak. So I wrote a small test program, since my memory leak seems to be related to the ImageIO.read() method. My test application consists of a simple JFrame with a JButton that starts the following Action:
public void actionPerformed(ActionEvent e)
{
    File folder = new File("C:\\Pictures");
    ArrayList<File> files = new ArrayList<File>(Arrays.asList(folder.listFiles()));
    try
    {
        for (File file : files)
            ImageIO.read(file);
    }
    catch (Exception a)
    {
        a.printStackTrace();
    }
}
Although I do NOT save the return value of ImageIO.read - that is, the image - my application allocates a huge amount of memory (~800 MB). For test purposes, the folder C:\Pictures contains ~23k pictures with a total size of 25 GB.
Why does ImageIO.read() reserve that much memory, even though the image is returned and never stored anywhere?
It doesn't 'reserve that much memory'. It can't. All that seems to have happened here is that it took about 800 image loads before GC kicked in.
Why are you loading 23k images? It seems a strange thing to do. Where are you going to display them? Do you have an extra-large screen?
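If the point of loading that many images is to show thumbnails, ImageIO can subsample while decoding, so the full-size bitmap never lands on the heap at all. This is a rough sketch under that assumption (the class and method names are mine):

```java
import javax.imageio.ImageIO;
import javax.imageio.ImageReadParam;
import javax.imageio.ImageReader;
import javax.imageio.stream.ImageInputStream;
import java.awt.image.BufferedImage;
import java.io.File;
import java.io.IOException;
import java.util.Iterator;

public class SubsampledRead {
    // Decodes only every n-th pixel in both directions, so a 4x subsample of a
    // 10000x10000 image produces a ~2500x2500 BufferedImage instead of the full bitmap.
    static BufferedImage readSubsampled(File file, int factor) throws IOException {
        try (ImageInputStream in = ImageIO.createImageInputStream(file)) {
            Iterator<ImageReader> readers = ImageIO.getImageReaders(in);
            if (!readers.hasNext()) throw new IOException("No reader for " + file);
            ImageReader reader = readers.next();
            try {
                reader.setInput(in);
                ImageReadParam param = reader.getDefaultReadParam();
                param.setSourceSubsampling(factor, factor, 0, 0);
                return reader.read(0, param);
            } finally {
                reader.dispose(); // release the reader's native/cached resources
            }
        }
    }
}
```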
I'm currently making an Android App that modifies some bytes of an image. For this, I've written this code:
Bitmap bmp = BitmapFactory.decodeStream(new FileInputStream(path));
// Allocate enough space for the full pixel data (getByteCount(), not width*height,
// since ARGB_8888 uses 4 bytes per pixel)
ByteBuffer buffer = ByteBuffer.allocate(bmp.getByteCount());
bmp.copyPixelsToBuffer(buffer);
return buffer.array();
The problem is that this approach uses too much heap memory and throws an OutOfMemoryError.
I know that I can make the heap for the app bigger, but that doesn't seem like a good design choice.
Is there a more memory-friendly way of changing the bytes of an image?
It looks like there are two copies of the pixel data on the managed heap:
The uncompressed data in the Bitmap
The copy of the data in the ByteBuffer
The memory requirement could be halved by leaving the data in the Bitmap and using getPixel() / setPixel() (or perhaps editing a row at a time with the "bulk" variants), but that adds some overhead.
Depending on the nature of the image, you may be able to use a less precise format (e.g. RGB 565 instead of 8888), halving the memory requirement.
As noted in one of the comments, you could uncompress the data to a file, memory-map it with java.nio.channels.FileChannel#map(), and access it through a MappedByteBuffer. This adds a fair bit of overhead to loading and saving, and may be annoying since you have to work through a ByteBuffer rather than a byte[].
Another option is expanding the heap with android:largeHeap (documented here), though in some respects you're just postponing the inevitable: you may be asked to edit an image that is too large for the "large" heap. Also, the capacity of a "large" heap varies from device to device, just as the "normal-sized" heap does. Whether or not this makes sense depends in part on how large the images you're loading are.
Before you do any of this I'd recommend using the heap analysis tools (see e.g. this blog post) to see where your memory is going. Also, look at the logcat above the out-of-memory exception; it should identify the size of the allocation that failed. Make sure it looks "reasonable", i.e. you're not inadvertently allocating significantly more than you think you are.
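For comparison, here is the read-modify-write-in-place idea in plain desktop Java, where BufferedImage's getRGB()/setRGB() play the same role as Bitmap's getPixel()/setPixel(); the invert operation is just an arbitrary example edit:

```java
import java.awt.image.BufferedImage;

public class InPlaceEdit {
    // Inverts an image's color channels without ever materializing a second
    // full-size copy of the pixel data: read, modify, and write back per pixel.
    static void invert(BufferedImage img) {
        for (int y = 0; y < img.getHeight(); y++) {
            for (int x = 0; x < img.getWidth(); x++) {
                int argb = img.getRGB(x, y);
                // Flip the RGB bits but keep the alpha channel intact
                int inverted = (argb & 0xFF000000) | (~argb & 0x00FFFFFF);
                img.setRGB(x, y, inverted);
            }
        }
    }
}
```

The per-pixel accessors add call overhead, which is the trade-off mentioned above; the win is that peak memory stays at one copy of the image instead of two.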
I am writing the bytes of an image to a ByteArrayOutputStream and then sending it over a socket.
The problem is, when I do
ImageIO.write(image, "gif", byteArray);
memory usage goes up VERY much, almost like a memory leak.
I send using this
ImageIO.write(image, "gif", byteArrayO);
byte [] byteArray = byteArrayO.toByteArray();
byteArrayO.flush();
byteArrayO.reset();
Connection.pw.println("" + byteArray.length);
int old = Connection.client.getSendBufferSize();
Connection.client.setSendBufferSize(byteArray.length);
Connection.client.getOutputStream().write(byteArray, 0, byteArray.length);
Connection.client.getOutputStream().flush();
image.flush();
image = null;
byteArrayO = null;
byteArray = null;
System.gc();
Connection.client.setSendBufferSize(old);
As you can see, I have tried everything; the problem occurs when I write to the ByteArrayOutputStream, not when I transfer it. The receiver does not get any errors.
Is there any way I can clear the byteArray and remove everything it holds from memory? I know reset() should, but it doesn't here. I want to dispose of the ByteArrayOutputStream as soon as this is done.
@Christoffer Hammarström probably has the best solution, but I'll add this to try to explain the memory usage.
These 2 lines are creating 3 copies of your image data:
ImageIO.write(image, "gif", byteArrayO);
byte [] byteArray = byteArrayO.toByteArray();
After executing this you have one copy of the data stored in image, one copy in the ByteArrayOutputStream, and another copy in the byte array (toByteArray() does not return the internal buffer; it creates a copy).
Calling reset() does not release the memory inside the ByteArrayOutputStream, it just resets the position counter back to 0. The data is still there.
To allow the memory to be released earlier, you can assign null to each reference as soon as you have finished with it. This allows the garbage collector to reclaim the memory if it decides to run earlier, e.g.:
ImageIO.write(image, "gif", byteArrayO);
image = null;
byte [] byteArray = byteArrayO.toByteArray();
byteArrayO = null;
...
Why do you have to fiddle with the send buffer size? What kind of protocol are you using on top of this socket? It should be just as simple as:
ImageIO.write(image, "gif", Connection.client.getOutputStream());
If you have to use a ByteArrayOutputStream, at least use
byteArrayO.writeTo(Connection.client.getOutputStream())
so you don't make an extra redundant byte[].
This is not quite the answer you want, but something you might wish to consider.
Why not create a pool of byte arrays and reuse them every time you need one? This is a little more efficient, since you won't be creating new arrays and throwing them away all the time; less GC work is always a good thing. You will also be able to guarantee that the application has enough memory to operate in at all times.
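A pool like that can be quite small. Here is a minimal sketch (the class name and sizes are my own, not from any library):

```java
import java.util.ArrayDeque;

// A minimal fixed-size buffer pool: borrow a byte[] instead of allocating a
// new one, and return it when done so the next caller can reuse it.
public class BufferPool {
    private final ArrayDeque<byte[]> free = new ArrayDeque<>();
    private final int bufferSize;

    public BufferPool(int bufferSize, int count) {
        this.bufferSize = bufferSize;
        for (int i = 0; i < count; i++) free.push(new byte[bufferSize]);
    }

    public synchronized byte[] acquire() {
        byte[] buf = free.poll();
        // Fall back to a fresh allocation if the pool is exhausted
        return buf != null ? buf : new byte[bufferSize];
    }

    public synchronized void release(byte[] buf) {
        if (buf.length == bufferSize) free.push(buf); // only pool matching sizes
    }
}
```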
You can request that the VM run garbage collection through System.gc(), but this is NOT guaranteed to actually happen. The virtual machine performs garbage collection when it decides it is necessary or that it is an appropriate time.
What you are describing is pretty normal. It has to put the bytes of the image you are creating somewhere.
Instead of memory, you can use a FileOutputStream to write the bytes to. You then make a FileInputStream to read from the file you wrote, and a loop which reads bytes into a byte array buffer of, say, 64 KB and writes those bytes to the connection's output stream.
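The copy loop just described might look like this (the class and method names are placeholders, and the OutputStream stands in for your socket's output stream):

```java
import java.io.File;
import java.io.FileInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

public class StreamImage {
    // Copies a file to an output stream in 64 KB chunks, so only one small
    // buffer is ever held in memory instead of the whole encoded image.
    static long copy(File src, OutputStream out) throws IOException {
        long total = 0;
        try (InputStream in = new FileInputStream(src)) {
            byte[] buffer = new byte[64 * 1024];
            int n;
            while ((n = in.read(buffer)) != -1) {
                out.write(buffer, 0, n);
                total += n;
            }
        }
        out.flush();
        return total; // number of bytes transferred
    }
}
```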
You mention error. If you are getting an error what is the error?
If you use the client JVM (the -client argument to java), the memory might be given back to the OS and the Java process might shrink again. I'm not sure about this.
If you don't like how much memory JAI is using, you can try Sanselan: http://commons.apache.org/imaging/
I am having strange problems with BufferedImage, which in some cases consumes all the free system memory (3 GB, 1.5 GB free).
I have created a simple wrapper and I use it like this:
public ImageWrapper(final byte[] bytes) throws ImageWrapperException {
    this(new ByteArrayInputStream(bytes));
}

public ImageWrapper(final ByteArrayInputStream bis) throws ImageWrapperException {
    try {
        image = ImageIO.read(bis);
        bis.close();
    } catch (IOException e) {
        throw new ImageWrapperException(e);
    }
}
(I have just verified that it happens even with image = ImageIO.read(file);)
I am not getting any exceptions until the first "Cannot allocate memory".
For some reason, when reading a specific type of image, the read ends up consuming all the system memory. I am not talking about the heap, but really the system memory.
It happens only in certain environments - it does not happen on my OSX machine, but it does on my Debian server.
Do you have an idea why this could be happening?
Are there any alternatives to BufferedImage that might work better?
The problematic machine is a virtual server. Could this be caused by its configuration?
Thanks
EDIT:
Example image: http://cl.ly/1P430l0V1g133r0C291J
It is just the first and only instance which will produce this.
I have just verified that it also happens with image = ImageIO.read(file); - I am starting to think that it must be something outside of Java, some buggy native library...
EDIT2:
So the problem is with the file system - I have a 7 GB directory with thousands of images inside. When I try to read a file, it consumes all the memory - I suppose it is some kind of file system issue.
There are some known bugs related to ImageIO.read() and BufferedImage:
http://bugs.sun.com/bugdatabase/view_bug.do?bug_id=7166379
http://bugs.sun.com/view_bug.do?bug_id=6716560
There is definitely something wrong with BufferedImage - I've tested it on two servers and it leaked with the same result: the system ran completely out of memory.
In the end I wrote a simple wrapper over PHP, and I now use GD for the image manipulation. It works fine now. Thanks for all the suggestions!
Try moving the code to java.nio and memory-mapped file access; mapped buffers are stored outside the heap.
This SO question is interesting.
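As a sketch of the memory-mapped approach: dump the raw pixel bytes to a file, map it with FileChannel, and edit the bytes through the mapping, so the pixel data lives in the OS page cache rather than on the Java heap (the file layout and the brighten operation are my own example, not a standard format):

```java
import java.io.IOException;
import java.nio.MappedByteBuffer;
import java.nio.channels.FileChannel;
import java.nio.file.Path;
import java.nio.file.StandardOpenOption;

public class MappedPixels {
    // Brightens a file of raw 8-bit pixel values in place. The mapped buffer
    // is backed by the file, so huge images never need to fit within -Xmx.
    static void brighten(Path rawPixels, int delta) throws IOException {
        try (FileChannel ch = FileChannel.open(rawPixels,
                StandardOpenOption.READ, StandardOpenOption.WRITE)) {
            MappedByteBuffer buf = ch.map(FileChannel.MapMode.READ_WRITE, 0, ch.size());
            for (int i = 0; i < buf.limit(); i++) {
                int v = Math.min(255, (buf.get(i) & 0xFF) + delta); // clamp to 255
                buf.put(i, (byte) v);
            }
            buf.force(); // flush the changes back to the file
        }
    }
}
```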