I am basically trying to compress and pass a Base64 representation of an image selected by the user, but the app crashes on different phones with OutOfMemoryError. Here's my compression and conversion code:
Bitmap bm = BitmapFactory.decodeFile(filePath);
ByteArrayOutputStream baos = new ByteArrayOutputStream();
bm.compress(Bitmap.CompressFormat.JPEG, 100, baos);
byte[] byteArrayImage = baos.toByteArray();
String base64String = Base64.encodeToString(byteArrayImage, Base64.DEFAULT);
This process is also painfully slow and causes the app to crash sometimes.
Here's an exception I got:
java.lang.OutOfMemoryError: Failed to allocate a 5035548 byte allocation with 5011320 free bytes and 4MB until OOM
at dalvik.system.VMRuntime.newNonMovableArray(Native Method)
at android.graphics.BitmapFactory.nativeDecodeAsset(Native Method)
at android.graphics.BitmapFactory.decodeStream(BitmapFactory.java:625)
at android.graphics.BitmapFactory.decodeResourceStream(BitmapFactory.java:460)
at android.graphics.drawable.Drawable.createFromResourceStream(Drawable.java:973)
at android.content.res.Resources.loadDrawableForCookie(Resources.java:2477)
What changes should I make?
final BitmapFactory.Options options = new BitmapFactory.Options();
// First pass: decode only the bounds so a sample size can be chosen
options.inJustDecodeBounds = true;
BitmapFactory.decodeFile(filePath, options);
options.inSampleSize = 2; // you can also calculate inSampleSize from the decoded bounds (see the sketch below)
// Second pass: decode the actual, down-sampled pixels
options.inJustDecodeBounds = false;
options.inTempStorage = new byte[16 * 1024];
Bitmap bm = BitmapFactory.decodeFile(filePath, options); // changed line of code
ByteArrayOutputStream baos = new ByteArrayOutputStream();
bm.compress(Bitmap.CompressFormat.JPEG, 100, baos);
byte[] byteArrayImage = baos.toByteArray();
String base64String = Base64.encodeToString(byteArrayImage, Base64.DEFAULT);
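If you want to calculate inSampleSize instead of hard-coding it, here is a minimal sketch based on the pattern from the Android documentation (reqWidth and reqHeight are whatever target dimensions you choose, and options must hold the bounds from the first decode pass):
public static int calculateInSampleSize(BitmapFactory.Options options, int reqWidth, int reqHeight) {
    final int height = options.outHeight;
    final int width = options.outWidth;
    int inSampleSize = 1;
    if (height > reqHeight || width > reqWidth) {
        final int halfHeight = height / 2;
        final int halfWidth = width / 2;
        // Keep doubling the sample size while the halved dimensions still exceed the target
        while ((halfHeight / inSampleSize) >= reqHeight && (halfWidth / inSampleSize) >= reqWidth) {
            inSampleSize *= 2;
        }
    }
    return inSampleSize;
}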
Note: Using android:largeHeap="true" for your application is not considered an ideal solution.
Here's the extract from the Google documentation that explains why:
However, the ability to request a large heap is intended only for a
small set of apps that can justify the need to consume more RAM (such
as a large photo editing app). Never request a large heap simply
because you've run out of memory and you need a quick fix—you should
use it only when you know exactly where all your memory is being
allocated and why it must be retained. Yet, even when you're confident
your app can justify the large heap, you should avoid requesting it to
whatever extent possible. Using the extra memory will increasingly be
to the detriment of the overall user experience because garbage
collection will take longer and system performance may be slower when
task switching or performing other common operations.
Here's the complete link to the documentation: https://developer.android.com/training/articles/memory.html
Edit 1: For efficient scaling of images (like WhatsApp's image compression), check out this SO answer.
Try to recycle your bitmap after you have used it, and set the reference to null as well. If you want, you can also request a garbage collection run.
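A minimal sketch of that clean-up, assuming bm is the bitmap from the code above:
// Release the pixel memory as soon as the Base64 string has been produced
if (bm != null && !bm.isRecycled()) {
    bm.recycle();   // frees the bitmap's native pixel data
}
bm = null;          // drop the reference so the object itself can be collected
// System.gc();     // optional: only a hint to the runtime, not a guarantee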
This approach won't work well for very large pictures, say a picture taken with the camera. A 13 MP photo is 4128 x 3096 x 3 bytes, which is about 40 megabytes, and that is the size of the bitmap alone. If you create the Base64 representation on the fly, it takes another 40 megabytes and then some, since a Base64 string costs more bytes to store than the comparable raw byte array (bitmap).
Do you really need to turn it into Base64? If, for example, you want to upload it, can you do that directly via a REST API or a multipart POST request?
If you can't do that, perhaps you can split the operation, say per 1 MB, or instead of building that string in memory you can write it to a file and append to it after every 1 MB chunk.
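As a sketch of the splitting idea, android.util.Base64OutputStream lets you stream the encoding to a file in small chunks instead of holding the whole Base64 string in memory (imageFile and outFile are placeholder names here):
try (InputStream in = new FileInputStream(imageFile);
     OutputStream out = new Base64OutputStream(new FileOutputStream(outFile), Base64.DEFAULT)) {
    byte[] buffer = new byte[8 * 1024];      // one small, reused buffer
    int read;
    while ((read = in.read(buffer)) != -1) {
        out.write(buffer, 0, read);          // Base64-encoded bytes are appended to outFile
    }
}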
Related
I have created a program that loads an image with FileDialog, resizes it, previews it to the user, and, after a button click, saves it to a folder.
My problem is:
when I run my program - RAM usage ~50 MB
loading a 1 MB JPG file - RAM usage ~93 MB
saving the 1 MB JPG file - RAM usage ~160 MB
I intend this program to be lightweight, but after 3-4 files it occupies 500 MB of RAM.
I tried to use System.gc(); every time the user saves a file, but it reduced RAM usage by only ~10%.
Below is the code for loading and saving images; the full code you can find HERE.
BTW - why does the file size increase to 10 MB after loading a 1 MB JPG and then saving it?
Code for loading image:
FileDialog imageFinder = new FileDialog((Frame)null, "Select file to open:");
imageFinder.setFile("*.jpg; *.png; *.gif; *.jpeg");
imageFinder.setMode(FileDialog.LOAD);
imageFinder.setVisible(true);
userImagePath = new File(imageFinder.getDirectory()).getAbsolutePath()+"\\"+imageFinder.getFile();
userImagePath = userImagePath.replace("\\", "/");
Code for saving image:
BufferedImage bimage = new BufferedImage(userImage.getWidth(null), userImage.getHeight(null), BufferedImage.TYPE_INT_ARGB);
Graphics2D bGr = bimage.createGraphics();
bGr.drawImage(userImage, 0, 0, null);
bGr.dispose();
try {
BufferedImage bi = bimage;
File outputfile = new File("C:\\Users\\Mariola\\git\\MySQL-viwer\\MySQL viewer\\src\\database_images\\"+userBreedInfo[0]+".jpg");
ImageIO.write(bi, "png", outputfile);
} catch (IOException e1) {
}
}
System.gc()
The "problem" is that ImageIO kind of uses much memory. Then this memory will not be returned to the OS (that's why even a no-need call to System.gc() will not return it) because that's how JVM works.(Java 13 promises that memory will be returned to the OS?) As #Andrew Thompson pointed out in the comment section, if you want less memory consumption take a look at this question. If you run it you will see that with memory limit it will not consume so much. Which actually tells you not to worry about it. JVM will do its magic and will handle memory consumption according to how much memory the OS says its free.
If it still bothers you, you could try to find any ImageIO alternatives that may behave differently. In my opinion though, it does not worth it for your needs. I mean, you just want to save/load an image.
Another worth-to-read question is Why is it bad practice to call System.gc()?
A few days ago I was given a solution for checking collision between two bitmaps that have the ALPHA_8 config. But upon using it I noticed my app started lagging badly, and when I checked the logs I saw the garbage collector running every millisecond.
I tried removing a few lines and found out that what was making the garbage collector go crazy were these lines:
byte[] pixelData = getPixels(bitmap1);
byte[] pixelData2 = getPixels(bitmap2);
which called this function:
public byte[] getPixels(Bitmap bmp) {
    int bytes = bmp.getRowBytes() * bmp.getHeight(); // size of the pixel data in bytes
    ByteBuffer buffer = ByteBuffer.allocate(bytes);  // a fresh allocation on every call
    bmp.copyPixelsToBuffer(buffer);
    return buffer.array();
}
Why? What can I do to make it stop?
You are allocating large contiguous blocks of memory (i.e. a byte[]). Depending on how large your images are, this could be accounting for a significant amount of your available heap.
If you are going to be doing a lot of these kinds of operations, it may be worth considering pooling byte[] instances of fixed sizes for reuse.
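A minimal sketch of that idea, with a hypothetical pool keyed by buffer size, so a new array is not allocated on every call:
private final Map<Integer, byte[]> pixelBufferPool = new HashMap<>();

public byte[] getPixels(Bitmap bmp) {
    int bytes = bmp.getRowBytes() * bmp.getHeight();
    byte[] buffer = pixelBufferPool.get(bytes);
    if (buffer == null) {
        buffer = new byte[bytes];                        // allocated once per distinct size
        pixelBufferPool.put(bytes, buffer);
    }
    bmp.copyPixelsToBuffer(ByteBuffer.wrap(buffer));     // fills the pooled array in place
    return buffer;                                       // don't keep it across calls; it will be overwritten
}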
I have an Android application that reads many chunks of bytes one by one over the network and then combines them into a large buffer. For example,
ByteArrayOutputStream outputStream = new ByteArrayOutputStream();
int i = 0;
while (i < 10) {
    // read is an API in a lib that returns byte[].
    byte[] bytes = API.read();
    outputStream.write(bytes);
    i++;
}
...
The question is about the memory for bytes. Is there a way to force Java to reuse the same chunk of bytes for all reads, so that it does not have to free and allocate memory so often? Will the Java runtime optimize this case? Thanks.
The byte[] will be garbage collected. It is not appropriate to use an NIO ByteBuffer in this case as you are getting byte[] anyway, though it could come in handy later.
With each loop iteration, a byte[] is created and filled with data, written into the stream, and then no longer used. Once memory runs low (or earlier, depending on how your JVM operates) the array will be deleted and the memory made available again.
You need not worry about such things most of the time (unless you are concatenating tons of strings, which is extremely inefficient for this reason).
I am writing the bytes of an image to a ByteArrayOutputStream and then sending it over a socket.
The problem is, when I do
ImageIO.write(image, "gif", byteArray);
memory usage goes up a LOT, almost like a memory leak.
I send it using this:
ImageIO.write(image, "gif", byteArrayO);
byte [] byteArray = byteArrayO.toByteArray();
byteArrayO.flush();
byteArrayO.reset();
Connection.pw.println("" + byteArray.length);
int old = Connection.client.getSendBufferSize();
Connection.client.setSendBufferSize(byteArray.length);
Connection.client.getOutputStream().write(byteArray, 0, byteArray.length);
Connection.client.getOutputStream().flush();
image.flush();
image = null;
byteArrayO = null;
byteArray = null;
System.gc();
Connection.client.setSendBufferSize(old);
As you can see, I have tried every way I know; the error comes when I write to the ByteArrayOutputStream, not when I transfer it. The receiver does not get any errors.
Is there any way I can clear the byteArray and remove everything it holds from memory? I know reset() should do that, but it doesn't here. I want to dispose of the ByteArrayOutputStream directly when this is done.
@Christoffer Hammarström probably has the best solution, but I'll add this to try to explain the memory usage.
These 2 lines are creating 3 copies of your image data:
ImageIO.write(image, "gif", byteArrayO);
byte [] byteArray = byteArrayO.toByteArray();
After executing this you have one copy of the data stored in image, one copy in the ByteArrayOutputStream, and another copy in the byte array (toByteArray() does not return the internal buffer; it creates a copy).
Calling reset() does not release the memory inside the ByteArrayOutputStream, it just resets the position counter back to 0. The data is still there.
To allow the memory to be released earlier, you could assign each item to null as soon as you have finished with it. This will allow the memory to be collected by the garbage collector if it decides to run earlier, e.g.:
ImageIO.write(image, "gif", byteArrayO);
image = null;
byte [] byteArray = byteArrayO.toByteArray();
byteArrayO = null;
...
Why do you have to fiddle with the send buffer size? What kind of protocol are you using on top of this socket? It should be just as simple as:
ImageIO.write(image, "gif", Connection.client.getOutputStream());
If you have to use a ByteArrayOutputStream, at least use
byteArrayO.writeTo(Connection.client.getOutputStream())
so you don't make an extra redundant byte[].
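If the receiver really does need the length up front, a sketch using the same Connection fields from your code could look like this (error handling omitted):
ByteArrayOutputStream byteArrayO = new ByteArrayOutputStream();
ImageIO.write(image, "gif", byteArrayO);                  // encode once into the stream's buffer
Connection.pw.println(byteArrayO.size());                 // length header, no toByteArray() needed
byteArrayO.writeTo(Connection.client.getOutputStream());  // write the buffer directly, no extra copy
Connection.client.getOutputStream().flush();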
This is not quite the answer you want, but something you might wish to consider.
Why not create a pool of byte arrays and reuse them every time you need one? This will be a little more efficient, as you won't be creating new arrays and throwing them away all the time. Producing less garbage-collection work is always a good thing, and you will also be able to guarantee that the application has enough memory to operate in at all times.
You can request that the VM run garbage collection through System.gc(), but this is NOT guaranteed to actually happen. The virtual machine performs garbage collection when it decides it is necessary or when it is an appropriate time.
What you are describing is pretty normal. It has to put the bytes of the image you are creating somewhere.
Instead of memory, you can use a FileOutputStream to write the bytes to. You then make a FileInputStream to read from the file you wrote, plus a loop that reads bytes into a byte array buffer of, say, 64 KB and writes those bytes to the connection's output stream.
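A minimal sketch of that approach, assuming the same image and Connection.client as in your code (error handling omitted):
// Spool the encoded image to a temporary file, then stream it to the socket in 64 KB chunks
File temp = File.createTempFile("frame", ".gif");
try (OutputStream fileOut = new FileOutputStream(temp)) {
    ImageIO.write(image, "gif", fileOut);        // encoded bytes go to disk, not the heap
}
try (InputStream fileIn = new FileInputStream(temp)) {
    OutputStream socketOut = Connection.client.getOutputStream();
    byte[] buffer = new byte[64 * 1024];         // one small, reusable buffer
    int read;
    while ((read = fileIn.read(buffer)) != -1) {
        socketOut.write(buffer, 0, read);
    }
    socketOut.flush();
}
temp.delete();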
You mention an error. If you are getting an error, what is it?
If you use the client JVM (-client argument to java) then the memory might be given back to the OS and the Java process will shrink again. I'm not sure about this.
If you don't like how much memory JAI is using you can try using Sanselan: http://commons.apache.org/imaging/
I am writing a remote desktop application, so I am transferring images from one machine to another as byte arrays through a socket. After receiving a byte array I convert it into an image and draw it on a panel. The code looks approximately like this:
imageBytes = //read from socket.
InputStream in = new ByteArrayInputStream(imageBytes);
BufferedImage bufferedImage = ImageIO.read(in);
Image image = Toolkit.getDefaultToolkit().createImage(bufferedImage.getSource());
Image scaledImage = image.getScaledInstance(rmdPanel.getWidth(),rmdPanel.getHeight() ,Image.SCALE_FAST);
Graphics graphics = rmdPanel.getGraphics();
graphics.drawImage(scaledImage, 0, 0, rmdPanel.getWidth(),rmdPanel.getHeight(),rmdPanel);
I also store the image bytes until the next image arrives (for comparison). Now I am getting a Java OutOfMemoryError in this code (while receiving the byte array). I have a heap size of 128 MB-512 MB. The image bytes sent are at most 3 MB.
(You don't show the communication code, so I'm just guessing.) If you are using ObjectInputStream/ObjectOutputStream over the socket streams, you need to be aware that they cache objects sent over the wire (to avoid resending the same data). Sometimes this is a nice feature, but it can cause problems if objects are held for too long. You need to periodically call reset() on the ObjectOutputStream to clear this cache (in your case, possibly after every image send).
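A minimal sketch of the sending side, assuming an ObjectOutputStream named out wraps the socket's output stream:
out.writeObject(imageBytes);  // send the next frame's bytes
out.flush();
out.reset();                  // clear the handle cache so previously sent byte[]s can be garbage collected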
Of course, the surest way to solve this problem is to attach a memory profiler and see what's using all the memory (or analyze a heap dump).
imageBytes = //read from socket.
InputStream in = new ByteArrayInputStream(imageBytes);
BufferedImage bufferedImage = ImageIO.read(in);
Why read the image bytes into an array? You don't need that. It is costing you at least one extra copy of the data, maybe two if the ByteArrayInputStream copies the byte array. Just do ImageIO.read() straight from the socket.
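For example (a sketch, assuming socket is the connection the image data arrives on):
// Decode straight from the socket's stream; no intermediate byte[] or ByteArrayInputStream
BufferedImage bufferedImage = ImageIO.read(socket.getInputStream());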
I think ImageIO.read(...) does some sort of caching of images or their input streams, so that may be causing you to run out of memory as you keep reading images.