I'm currently trying to figure out where my application has a memory leak. I wrote a small test program, since my memory leaks seem to be related to the ImageIO.read() method. My test application consists of a simple JFrame with a JButton, which triggers the following action:
public void actionPerformed(ActionEvent e)
{
    File folder = new File("C:\\Pictures");
    ArrayList<File> files = new ArrayList<File>(Arrays.asList(folder.listFiles()));
    try
    {
        for (File file : files)
            ImageIO.read(file); // return value is deliberately discarded
    }
    catch (Exception a)
    {
        a.printStackTrace();
    }
}
Although I do NOT save the return value of ImageIO.read (that is, the image), my application's memory usage grows enormously (~800 MB). For testing purposes, the folder C:\Pictures contains ~23k pictures with a total size of 25 GB.
Why does ImageIO.read() hold on to that much memory, even after it has returned and the image is NOT stored anywhere else?
It doesn't 'reserve that much memory'. It can't. All that seems to have happened here is that it took about 800 image loads before GC kicked in.
Why are you loading 23k images? It seems a strange thing to do. Where are you going to display them? Do you have an extra-large screen?
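It's easy to check this for yourself. Below is a minimal sketch (the C:\Pictures path is the one from the question): run it with a small heap such as -Xmx256m and watch the used-heap figure; it stays bounded, because every discarded image becomes garbage and gets collected.
import javax.imageio.ImageIO;
import java.io.File;
import java.io.IOException;

public class ImageReadGcDemo
{
    public static void main(String[] args) throws IOException
    {
        File[] files = new File("C:\\Pictures").listFiles();
        if (files == null)
            return;
        Runtime rt = Runtime.getRuntime();
        int count = 0;
        for (File file : files)
        {
            ImageIO.read(file); // return value deliberately discarded, as in the question
            if (++count % 100 == 0)
            {
                long usedMb = (rt.totalMemory() - rt.freeMemory()) / (1024 * 1024);
                System.out.println(count + " images read, ~" + usedMb + " MB heap in use");
            }
        }
    }
}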
Related
I have created a program that loads an image with FileDialog, resizes it, shows a preview to the user, and saves it to a folder after a button click.
My problem is:
when I run my program - RAM usage ~50mb
loading 1mb JPG file - RAM usage ~93mb
saving 1mb JPG file - RAM usage ~160mb
I intend this program to be lightweight, but after 3-4 files it occupies 500 MB of RAM.
I tried calling System.gc() every time the user saves a file, but it reduced RAM usage only by ~10%.
Below is the code for loading and saving images; the full code can be found HERE.
BTW - why does a 1 MB JPG grow to about 10 MB after loading and saving it?
Code for loading an image:
FileDialog imageFinder = new FileDialog((Frame)null, "Select file to open:");
imageFinder.setFile("*.jpg; *.png; *.gif; *.jpeg");
imageFinder.setMode(FileDialog.LOAD);
imageFinder.setVisible(true);
userImagePath = new File(imageFinder.getDirectory()).getAbsolutePath()+"\\"+imageFinder.getFile();
userImagePath = userImagePath.replace("\\", "/");
Code for saving an image:
BufferedImage bimage = new BufferedImage(userImage.getWidth(null), userImage.getHeight(null), BufferedImage.TYPE_INT_ARGB);
Graphics2D bGr = bimage.createGraphics();
bGr.drawImage(userImage, 0, 0, null);
bGr.dispose();
try {
    BufferedImage bi = bimage;
    File outputfile = new File("C:\\Users\\Mariola\\git\\MySQL-viwer\\MySQL viewer\\src\\database_images\\"+userBreedInfo[0]+".jpg");
    ImageIO.write(bi, "png", outputfile); // note: writes PNG data into a file named .jpg
} catch (IOException e1) {
    e1.printStackTrace(); // was an empty catch block; at least log the failure
}
}
The "problem" is that ImageIO does use a lot of memory. That memory is then not returned to the OS (which is why even an unnecessary call to System.gc() will not release it); that is simply how the JVM works (Java 13 promises that unused memory will be returned to the OS). As @Andrew Thompson pointed out in the comment section, if you want lower memory consumption, take a look at this question. If you run your program with a memory limit, you will see that it does not consume nearly as much. Which really tells you not to worry about it: the JVM will do its magic and manage memory consumption according to how much memory the OS says is free.
If it still bothers you, you could try to find ImageIO alternatives that may behave differently. In my opinion, though, it is not worth it for your needs. I mean, you just want to save/load an image.
Another question worth reading is Why is it bad practice to call System.gc()?
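As for the side question about the 1 MB JPG growing to ~10 MB: the saving code in the question draws the image into a TYPE_INT_ARGB buffer and then passes "png" as the format name to ImageIO.write(), even though the output file is named .jpg. What lands on disk is therefore a losslessly compressed PNG, which for photographic content is far larger than the JPEG it was loaded from. A minimal sketch of writing an actual JPEG instead (userImage, userBreedInfo and the output path are taken from the question; JPEG has no alpha channel, hence TYPE_INT_RGB):
// Draw into an opaque RGB buffer; the JPEG writer does not handle ARGB well
BufferedImage bimage = new BufferedImage(
        userImage.getWidth(null), userImage.getHeight(null), BufferedImage.TYPE_INT_RGB);
Graphics2D bGr = bimage.createGraphics();
bGr.drawImage(userImage, 0, 0, null);
bGr.dispose();

try {
    File outputfile = new File("C:\\Users\\Mariola\\git\\MySQL-viwer\\MySQL viewer\\src\\database_images\\" + userBreedInfo[0] + ".jpg");
    ImageIO.write(bimage, "jpg", outputfile); // format name now matches the file extension
} catch (IOException e1) {
    e1.printStackTrace();
}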
I am making a Java program.
It involves creating an image of up to 9933 × 14043 pixels (A0 size at 300 ppi). The image is 24-bit, so it would take up about 400 MB of space. The BufferedImage class somehow takes more RAM than the bitmap's actual size, so the image consumes about 600 MB of RAM.
With other data, the app takes at most about 700 MB of RAM when rendering the large image. I haven't had any problems with it so far. However, if the end user doesn't have enough free RAM, the JVM will not be able to allocate memory for the bitmap and will throw an OutOfMemoryError.
So what should I do?
I came up with a few options:
1. Catch the error and show a prompt to the user.
2. Wait until there is enough memory. If the wait lasts too long, show a prompt.
3. Write my own bitmap class and write the image out part by part with a FileOutputStream (a sketch of this banded approach follows the list). The .bmp format is not terribly complicated. (Actually, I have already written and optimized most of it.) Rendering the bitmap part by part means the whole image never has to stay in RAM, and the size of the parts can be adjusted dynamically to the available memory. However, this is reinventing the wheel and takes a significant amount of work. Also, any part of the image that involves text must be drawn into a BufferedImage first and then converted to my class (because I don't want to dig into the actual font formats). Anyway, if the Java BufferedImage class works in my case, I would rather not go down this road.
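For illustration, a minimal sketch of that banded approach, using the trivial binary PPM format rather than BMP to keep it short; the band height and the drawing code are placeholders:
import java.awt.Color;
import java.awt.Graphics2D;
import java.awt.image.BufferedImage;
import java.io.BufferedOutputStream;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.OutputStream;
import java.nio.charset.StandardCharsets;

public class BandedRenderer {
    public static void main(String[] args) throws IOException {
        int width = 9933, height = 14043; // A0 at 300 ppi, as in the question
        int bandHeight = 512;             // tune to the available memory
        try (OutputStream out = new BufferedOutputStream(new FileOutputStream("huge.ppm"))) {
            // PPM header: magic number, dimensions, max channel value
            out.write(("P6\n" + width + " " + height + "\n255\n").getBytes(StandardCharsets.US_ASCII));
            BufferedImage band = new BufferedImage(width, bandHeight, BufferedImage.TYPE_INT_RGB);
            int[] row = new int[width];
            byte[] rgb = new byte[width * 3];
            for (int y0 = 0; y0 < height; y0 += bandHeight) {
                int h = Math.min(bandHeight, height - y0);
                Graphics2D g = band.createGraphics();
                g.setColor(Color.WHITE);
                g.fillRect(0, 0, width, h);
                // ... draw the part of the scene that falls into rows [y0, y0 + h) ...
                g.dispose();
                for (int y = 0; y < h; y++) {
                    band.getRGB(0, y, width, 1, row, 0, width);
                    for (int x = 0; x < width; x++) {
                        int p = row[x];
                        rgb[x * 3] = (byte) (p >> 16);    // red
                        rgb[x * 3 + 1] = (byte) (p >> 8); // green
                        rgb[x * 3 + 2] = (byte) p;        // blue
                    }
                    out.write(rgb); // only one band is ever resident in RAM
                }
            }
        }
    }
}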
I doubt that anyone has less than a gig of RAM nowadays. So you can check whether the user has enough memory with Runtime.getRuntime().maxMemory(), and if they don't, just show an error and close. Here's an example that uses JOptionPane in the case of an error:
long memory = Runtime.getRuntime().maxMemory(); // in bytes
long required = 700 * 1024 * 1024; // 700 MB, in bytes
if (memory < required) {
    JOptionPane.showMessageDialog(null, "You don't have enough memory. (700MB required)", "Error", JOptionPane.ERROR_MESSAGE);
    System.exit(0);
}
maxMemory() returns the maximum amount of memory the JVM will attempt to use (in bytes).
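For reference, maxMemory() reflects the -Xmx launch flag when one is given, so a user who does have enough physical RAM can also raise the limit by starting the program with, for example, java -Xmx1g -jar yourapp.jar (the jar name is a placeholder).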
If I run the Scanner code below once, it runs flawlessly.
If I run it a second time, my app crashes and I get an "Out of memory" error in LogCat.
How do I go about freeing up the memory used by the initial run so that the app won't crash on the second run?
Any suggestions would be much appreciated
try
{
    myString = new Scanner(new File(myFilePath)).useDelimiter("\\A").next();
}
catch (FileNotFoundException e)
{
    e.printStackTrace();
}
Additional misc. info:
The purpose of the code is to load the entire contents of a large (1.5MB) text file into a string.
The exact error message in LogCat is: Out of memory on a 4194320-byte allocation
The code is being run in an AsyncTask background thread.
The try/catch was added automatically by Eclipse. I don't know whether it's formatted properly or not.
I tried emptying myString to free memory before the second run but that didn't help.
I've tried other methods of loading the file into a string (including the often-recommended Apache Utils methods) and settled on this one because it's incredibly fast compared to the others I've tried.
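One detail worth noting in the snippet above: the Scanner (and the file handle underneath it) is never closed. That may or may not be related to the crash, but it is a genuine resource leak on every run. A sketch of the same one-shot read that closes the Scanner deterministically (myString and myFilePath are the fields from the question; a finally block is used because try-with-resources needs a newer Android API level):
Scanner scanner = null;
try
{
    scanner = new Scanner(new File(myFilePath)).useDelimiter("\\A");
    // hasNext() guards against an empty file, where next() would throw
    myString = scanner.hasNext() ? scanner.next() : "";
}
catch (FileNotFoundException e)
{
    e.printStackTrace();
}
finally
{
    if (scanner != null)
        scanner.close(); // releases the underlying file handle
}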
I am having strange problems with BufferedImage, which in some cases consumes all the free system memory (3 GB total, 1.5 GB free).
I have created a simple wrapper and I use it like this:
public ImageWrapper(final byte[] bytes) throws ImageWrapperException {
    this(new ByteArrayInputStream(bytes));
}

public ImageWrapper(final ByteArrayInputStream bis) throws ImageWrapperException {
    try {
        image = ImageIO.read(bis);
        bis.close();
    } catch (IOException e) {
        throw new ImageWrapperException(e);
    }
}
(I have just verified that it happens even with image = ImageIO.read(file);)
I am not getting any exceptions until the first "Cannot allocate memory".
For some reason, when reading a specific type of image, the read ends with all the system memory consumed. I am not talking about the heap, but the actual system memory.
It happens only in certain environments: it does not happen on my OS X machine, but it does happen on my Debian server.
Do you have an idea why this could be happening?
Are there any alternatives to BufferedImage that might work better?
The problematic machine is a virtual server. Could it be caused by its configuration?
Thanks
EDIT:
Example image: http://cl.ly/1P430l0V1g133r0C291J
It is just the first and only image that will produce this.
I have just verified that it also happens with image = ImageIO.read(file); - I am starting to think that it must be something outside of Java, some native library which is buggy...
EDIT2:
So the problem is with the filesystem: I have a 7 GB directory with thousands of images inside, and when I try to read a file from it, all the memory gets consumed. I suppose it is some kind of filesystem issue.
There are some known bugs related to ImageIO.read() and BufferedImage:
http://bugs.sun.com/bugdatabase/view_bug.do?bug_id=7166379
http://bugs.sun.com/view_bug.do?bug_id=6716560
There is definitely something wrong with BufferedImage - I've tested it on two servers and it was leaking with the same result: the system completely out of memory.
In the end I wrote a simple wrapper over PHP and now use GD for image manipulation. It works fine now. Thanks for all the suggestions!
Try moving the code to java.nio and memory-mapped file access. Those buffers are stored outside the heap.
This SO question is interesting.
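A minimal sketch of what that could look like (the file path is a placeholder); the mapped region lives outside the Java heap and is paged in by the OS on demand:
import java.io.IOException;
import java.nio.MappedByteBuffer;
import java.nio.channels.FileChannel;
import java.nio.file.Paths;

public class MappedRead {
    public static void main(String[] args) throws IOException {
        // "image.jpg" is a placeholder path; FileChannel.open defaults to READ
        try (FileChannel ch = FileChannel.open(Paths.get("image.jpg"))) {
            MappedByteBuffer buf = ch.map(FileChannel.MapMode.READ_ONLY, 0, ch.size());
            // Bytes are accessed lazily; nothing here occupies the Java heap
            // beyond the small buffer object itself.
            System.out.println("Mapped " + buf.remaining() + " bytes, first byte: " + buf.get(0));
        }
    }
}
Note that ImageIO itself still decodes into a heap-allocated BufferedImage, so mapping mainly helps with handling the raw file bytes.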
I'm writing a game that so far has to work with about ~200 PNGs totaling ~14 MB, all with sizes ranging from 250x150 to 600x400 (these are xhdpi drawables, right now I'm letting android do the resizing for lower dpi).
The problem is that if I load them all when the game starts, I get an out of memory error after about 20 seconds, both on the emulator (with 1 GB RAM) and on a Galaxy S device (512 MB RAM).
I've seen games with much bigger resources, some even in the hundreds of MB. How do these handle loading their resources fast without exceeding the memory constraints? 14 MB shouldn't be that much.
Right now I'm using BitmapFactory.decodeResource() to load every bitmap inside a few for loops before the game starts. I also tried using a BitmapFactory.Options with inSampleSize set to 4, which fixes the out of memory problem, but not the slow loading, which still takes about 20 seconds. I would also rather not have to do this, because if I make the images 4 times smaller there is little point in accounting for hdpi and xhdpi screens at all - quality will be a lot worse on these. I also tried using getDrawable(), but it made no difference.
I also considered loading each resource as it is needed (after the game has started), but:
Wouldn't this be slow? Right now the system allocates more memory between pretty much every load, which takes 100-300 ms each time. This would seriously slow down my frame rate.
If I cache each image so it's only loaded once, eventually I will need all of them in the same level, so this shouldn't fix the out of memory error.
I know a bitmap takes more space in memory than the PNG on the disk - I get the error when allocating about 40 MB. Isn't there a way to just load the 14 MB in memory and build bitmaps from them as they are needed, or something that gives up a little speed (it's a pretty basic 2d game so I don't really need a lot of processing power or max fps) in exchange for a lot more free memory?
Also, I have a lot of small animations with about 10-16 frames. Each one is a separate file. Would there be any benefit in merging them in one single file and loading that? I can't see it helping with memory usage, but could it help with loading times?
Update: I changed my code to load each frame, display it, and then recycle() it. This leads to what is, as far as I can tell, < 5 FPS, so I don't think I can do this.
Try expanding your option (2) with a SoftReference cache of Bitmaps. That way, you would load your Bitmap only once, but if the VM is low on memory this would be the first candidate to be freed. Something like this:
public class MemoryCache {
    private HashMap<String, SoftReference<Bitmap>> cache = new HashMap<String, SoftReference<Bitmap>>();

    public Bitmap get(String id) {
        if (!cache.containsKey(id))
            return null;
        SoftReference<Bitmap> ref = cache.get(id);
        return ref.get(); // null if the GC has already reclaimed the bitmap
    }

    public void put(String id, Bitmap bitmap) {
        cache.put(id, new SoftReference<Bitmap>(bitmap));
    }

    public void clear() {
        cache.clear();
    }
}
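A hypothetical usage in a loader might look like this (the key and resource id are placeholders; a null result means the entry was never cached or was reclaimed under memory pressure, so it has to be decoded again):
import android.content.res.Resources;
import android.graphics.Bitmap;
import android.graphics.BitmapFactory;

public class FrameLoader {
    private final MemoryCache cache = new MemoryCache();

    public Bitmap getFrame(Resources res, String key, int resId) {
        Bitmap b = cache.get(key);
        if (b == null) { // not cached yet, or freed when memory got tight
            b = BitmapFactory.decodeResource(res, resId);
            cache.put(key, b);
        }
        return b;
    }
}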