I am trying to create an mxGraph and an image from the created mxGraph in Java. Below is the code to create the image from the mxGraph.
BufferedImage image = mxCellRenderer.createBufferedImage(graph,
null, 1, Color.WHITE, graphComponent.isAntiAlias(), null,
graphComponent.getCanvas());
// Creates the URL-encoded XML data
mxCodec codec = new mxCodec();
String xml = URLEncoder.encode(mxXmlUtils.getXml(codec.encode(graph.getModel())), "UTF-8");
mxPngEncodeParam param = mxPngEncodeParam.getDefaultEncodeParam(image);
param.setCompressedText(new String[] { "mxGraphModel", xml });
//Saves as a PNG file
outputStream = new FileOutputStream(new File("graph.jpg"));
ImageIO.write(image, "jpg", outputStream);
outputStream.close();
image = null;
I am using hierarchical layout in the graph.
But I am getting an out of memory error when creating the image for a larger graph.
How can I get rid of this memory issue?
Is there any alternative way to solve this problem (apart from increasing the heap size)?
See this post here:
http://forum.jgraph.com/questions/5408/save-as-png-detect-out-of-memory
especially the bottom part. There's a check in JGraphX which determines whether there's enough memory. That check is wrong: there may appear to be too little memory only because the GC hasn't run yet. If the GC ran, memory would be freed and the createBufferedImage method could succeed. So instead of checking the free memory, the allocation should simply be attempted in a try { ... } catch (Error err) { ... } block.
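A minimal sketch of that idea, reusing the call from the question (this is not the actual JGraphX fix, just the pattern):
BufferedImage image = null;
try {
    // Attempt the allocation and handle failure, rather than pre-checking free memory
    // (a free-memory check ignores space the GC could still reclaim)
    image = mxCellRenderer.createBufferedImage(graph,
        null, 1, Color.WHITE, graphComponent.isAntiAlias(), null,
        graphComponent.getCanvas());
} catch (OutOfMemoryError err) {
    // Out of heap: fail gracefully, e.g. render at a smaller scale or in tiles,
    // or report the problem to the user instead of crashing
}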
Firstly, I understand questions regarding java.lang.OutOfMemoryError and Bitmaps have already been asked numerous times before. I have also checked out the Displaying Bitmaps Efficiently page.
My use case:
I am storing two different sized Bitmaps as Strings in an SQLite database. The first size of the Bitmaps is 50% of the screen width, and the second size is 100% of the screen width.
I am using a RecyclerView which displays the images in ImageViews which are either 50% or 100% of the screen width, so the Bitmaps being loaded are no bigger than they need to be, and they are appropriately sized before the images are retrieved from the Database.
I am also loading the Bitmaps using an AsyncTask.
I have over 180 different items in the RecyclerView, so I have a total of over 360 Bitmaps (i.e. numberOfImages * theDifferentSizesOfEachImage) being created. I am converting the String versions of the images into byte arrays via this code: byte[] byteArray = Base64.decode(encodedString, Base64.DEFAULT);
The Problem
The Activity was able to load around 170 different images without incurring the java.lang.OutOfMemoryError, unless I restarted the same Activity (e.g. loaded the Activity, then recreated it by clicking on it again in the Navigation Drawer, and repeated that process), in which case I incurred the java.lang.OutOfMemoryError whilst converting the Strings into byte arrays.
I am converting the byte array to a Bitmap using the Glide library using the following code:
Bitmap bitmap = Glide.with(context).load(byteArray).asBitmap()
.dontTransform().dontAnimate().skipMemoryCache(true)
.diskCacheStrategy(DiskCacheStrategy.NONE).into(-1, -1).get();
My Question in a nutshell
How do I avoid the java.lang.OutOfMemoryError occurring whilst I am converting the Strings into byte arrays?
Note
After creating a Bitmap using Glide I am calling recycle() on the given Bitmap and then setting it to null.
I am also already using android:largeHeap="true" in my Android Manifest
Thanks in advance
Edit
Here is how I am creating the Bitmap Strings:
ByteArrayOutputStream baos = new ByteArrayOutputStream();
bitmap.compress(Bitmap.CompressFormat.WEBP, 100, baos);
byte[] b = baos.toByteArray();
String bitmapString = Base64.encodeToString(b, Base64.DEFAULT);
I would suggest that you drop your approach; it just won't work with a large set of images, and you are already putting too much work on your thread. One should never store images like that in SQLite.
You should instead convert each bitmap to a file with a unique name (or the same name, depending on your use case), save that file inside the app directory, and store only the file path in the database. Here is some code to help you.
File pictureFile = getOutputMediaFile(getActivity(), MEDIA_TYPE_IMAGE);
if (pictureFile == null) {
return;
}
// data is the raw image byte[] (e.g. your existing decoded byte array)
Bitmap bitmap = BitmapFactory.decodeByteArray(data, 0, data.length);
FileOutputStream fos = new FileOutputStream(pictureFile);
bitmap.compress(Bitmap.CompressFormat.JPEG, 100, fos);
fos.flush();
fos.close();
private File getOutputMediaFile(Context context, int m) {
File mediaStorageDir = context.getFilesDir();
if (!mediaStorageDir.exists()) {
if (!mediaStorageDir.mkdirs()) {
Log.d("Fade", "failed to create directory");
return null;
}
}
// Create a media file name
String timeStamp = new SimpleDateFormat("yyyyMMdd_HHmmss")
.format(new Date());
File mediaFile=new File(mediaStorageDir.getPath()+File.separator
+ "IMG_" + timeStamp + ".JPG");
return mediaFile;
}
Now you have the file: just store the file path in the database, and when it's needed you can always load the file from storage using Glide. This will also make your database queries faster.
This way you won't need any changes in Gradle or anywhere else. Try this.
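For retrieval, a minimal sketch (the column name, context and imageView are placeholders for whatever you actually use):
String path = cursor.getString(cursor.getColumnIndexOrThrow("image_path")); // hypothetical column name
Glide.with(context)
    .load(new File(path))
    .into(imageView);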
I have 13 .xlsx files with about 1000 rows in each of them. Now I want to merge them into one .xlsx file with one sheet. I'm using the code from here:
https://blog.sodhanalibrary.com/2014/11/merge-excel-files-using-java.html#.Vi9ns36rSUk
Here's my code (a few changes; the addSheet method is unchanged):
try {
FileInputStream excellFile1 = new FileInputStream(new File("tmp_testOut1000.xlsx"));
XSSFWorkbook workbook1 = new XSSFWorkbook(excellFile1);
XSSFSheet sheet1 = workbook1.getSheetAt(0);
for(int i = 2; i < 14; i++){
FileInputStream excellFile2 = new FileInputStream(new File("tmp_testOut" + i + "000.xlsx"));
XSSFWorkbook workbook2 = new XSSFWorkbook(excellFile2);
XSSFSheet sheet2 = workbook2.getSheetAt(0);
System.out.println("add " + i);
addSheet(sheet1, sheet2);
}
excellFile1.close();
// save merged file
System.out.println("merging");
File mergedFile = new File("merged.xlsx");
if (!mergedFile.exists()) {
mergedFile.createNewFile();
}
FileOutputStream out = new FileOutputStream(mergedFile);
System.out.println("write");
workbook1.write(out);
out.close();
System.out.println("Files were merged succussfully");
} catch (Exception e) {
e.printStackTrace();
}
All the files load and merge, but after the "write" sysout I'm getting
Exception in thread "main" java.lang.OutOfMemoryError: GC overhead limit exceeded
at org.apache.xmlbeans.impl.store.Xobj.new_cursor(Xobj.java:1829)
at org.apache.xmlbeans.impl.values.XmlObjectBase.newCursor(XmlObjectBase.java:293)
at org.apache.xmlbeans.impl.values.XmlComplexContentImpl.arraySetterHelper(XmlComplexContentImpl.java:1151)
at org.openxmlformats.schemas.spreadsheetml.x2006.main.impl.CTFontsImpl.setFontArray(Unknown Source)
at org.apache.poi.xssf.model.StylesTable.writeTo(StylesTable.java:424)
at org.apache.poi.xssf.model.StylesTable.commit(StylesTable.java:496)
at org.apache.poi.POIXMLDocumentPart.onSave(POIXMLDocumentPart.java:341)
at org.apache.poi.POIXMLDocumentPart.onSave(POIXMLDocumentPart.java:345)
at org.apache.poi.POIXMLDocument.write(POIXMLDocument.java:206)
at Start.main(Start.java:275)
What can I do? Why is this happening, and how can I prevent it?
POI is notoriously memory-hungry, so running out of memory is not uncommon when handling large Excel files.
If you are able to load all the original files and only get into trouble writing the merged file, you could try using an SXSSFWorkbook instead of an XSSFWorkbook and do regular flushes after adding a certain amount of content (see the POI documentation of the org.apache.poi.xssf.streaming package). This way you will not have to keep the whole generated file in memory, only small portions of it.
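A rough sketch of what that could look like (the actual row-copying is omitted; the window size of 100 rows is just an example):
SXSSFWorkbook workbook = new SXSSFWorkbook(100); // keep at most 100 rows in memory, flush the rest to disk
Sheet merged = workbook.createSheet("merged");
int rowNum = 0;
// ... read each source file and copy its rows into merged.createRow(rowNum++) ...
FileOutputStream out = new FileOutputStream("merged.xlsx");
workbook.write(out);
out.close();
workbook.dispose(); // removes the temporary files backing the flushed rows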
Try allocating more memory, e.g.
java -Xmx8192m
Also, you can try merging in one xlsx file at a time instead of loading them all at once.
You should also close each input stream as soon as you are done with it: call excellFile2.close() at the end of each loop iteration (and excellFile1.close() once the merge is finished), so the streams are released right away.
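In terms of the code from the question, the loop would then look roughly like this:
for (int i = 2; i < 14; i++) {
    FileInputStream excellFile2 = new FileInputStream(new File("tmp_testOut" + i + "000.xlsx"));
    XSSFWorkbook workbook2 = new XSSFWorkbook(excellFile2);
    XSSFSheet sheet2 = workbook2.getSheetAt(0);
    addSheet(sheet1, sheet2);
    excellFile2.close(); // close each input stream as soon as its sheet has been copied
}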
This issue occurs for the following reason:
The java.lang.OutOfMemoryError: GC overhead limit exceeded error is the JVM’s way of signalling that your application spends too much time doing garbage collection with too little result. By default the JVM is configured to throw this error if it spends more than 98% of the total time doing GC and when after the GC only less than 2% of the heap is recovered.
If you just want to suppress this check, you can set the following VM option:
-XX:-UseGCOverheadLimit
Refer to this link on GC overhead for more information.
You can also use the switches below to assign more heap memory to your application. Run your application for a while and identify how much memory works best for it:
-Xms128m -Xmx512m (these switches set the initial heap size to 128 MB and the maximum heap size to 512 MB)
If you can avoid the convenient but memory-hungry workbook APIs, work instead with the streaming approach of processing data row by row, which is much more memory efficient.
In particular, pay attention to the usage of:
XSSFReader.SheetIterator for looping over the sheets (a driving sketch follows the readSheet method below).
And finally take a good look at the usage of the API XSSFSheetXMLHandler for processing the rows within a sheet.
See the code in this project:
https://github.com/jeevatkm/excelReader/blob/master/src/main/java/com/myjeeva/poi/ExcelReader.java
You define how you want to process each row by creating your own:
new SheetContentsHandler...
This is quite like SAX parsing; it will barely take a bite out of your RAM.
private void readSheet(StylesTable styles, ReadOnlySharedStringsTable sharedStringsTable,
InputStream sheetInputStream) throws IOException, ParserConfigurationException, SAXException {
SAXParserFactory saxFactory = SAXParserFactory.newInstance();
XMLReader sheetParser = saxFactory.newSAXParser().getXMLReader();
// sheetContentsHandler is your own SheetContentsHandler implementation,
// called back row by row and cell by cell while the sheet XML is parsed
ContentHandler handler =
new XSSFSheetXMLHandler(styles, sharedStringsTable, sheetContentsHandler, true);
sheetParser.setContentHandler(handler);
sheetParser.parse(new InputSource(sheetInputStream));
}
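And, roughly, how the pieces fit together for one file (error handling omitted; the file name is taken from the question):
OPCPackage pkg = OPCPackage.open("tmp_testOut1000.xlsx", PackageAccess.READ);
XSSFReader reader = new XSSFReader(pkg);
ReadOnlySharedStringsTable sharedStrings = new ReadOnlySharedStringsTable(pkg);
StylesTable styles = reader.getStylesTable();
XSSFReader.SheetIterator sheets = (XSSFReader.SheetIterator) reader.getSheetsData();
while (sheets.hasNext()) {
    InputStream sheetStream = sheets.next();
    readSheet(styles, sharedStrings, sheetStream);
    sheetStream.close();
}
pkg.close();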
I have some old code that was working until recently, but seems to barf now that it runs on a new server using OpenJDK 6 rather than Java SE 6.
The problem seems to revolve around JAI.create. I have jpeg files which I scale and convert to png files. This code used to work with no leaks, but now that the move has been made to a box running OpenJDK, the file descriptors seem to never close, and I see more and more tmp files accumulate in the tmp directory on the server. These are not files I create, so I assume it is JAI that does it.
Another reason might be the larger heap size on the new server. If JAI cleans up on finalize, but GC happens less frequently, then maybe the files pile up because of that. Reducing the heap size is not an option, and we seem to be having unrelated issues with increasing ulimit.
Here's an example of a file that leaks when I run this:
/tmp/imageio7201901174018490724.tmp
Some code:
// Processor is an internal class that aggregates operations
// performed on the image, like resizing
private byte[] processImage(Processor processor, InputStream stream) {
byte[] bytes = null;
SeekableStream s = null;
try {
// Read the file from the stream
s = SeekableStream.wrapInputStream(stream, true);
RenderedImage image = JAI.create("stream", s);
BufferedImage img = PlanarImage.wrapRenderedImage(image).getAsBufferedImage();
// Process image
if (processor != null) {
image = processor.process(img);
}
// Convert to bytes
bytes = convertToPngBytes(image);
} catch (Exception e){
// error handling
} finally {
// Clean up streams
IOUtils.closeQuietly(stream);
IOUtils.closeQuietly(s);
}
return bytes;
}
private static byte[] convertToPngBytes(RenderedImage image) throws IOException {
ByteArrayOutputStream out = null;
byte[] bytes = null;
try {
out = new ByteArrayOutputStream();
ImageIO.write(image, "png", out);
bytes = out.toByteArray();
} finally {
IOUtils.closeQuietly(out);
}
return bytes;
}
My questions are:
Has anyone run into this and solved it? Since the tmp files created are not mine, I don't know what their names are and thus can't really do anything about them.
What're some of the libraries of choice for resizing and reformatting images? I heard of Scalr - anything else I should look into?
I would rather not rewrite the old code at this time, but if there is no other choice...
Thanks!
Just a comment on the temp files/finalizer issue, now that you seem to have solved the root of the problem (too long for a comment, so I'll post it as an answer... :-P):
The temp files are created by ImageIO's FileCacheImageInputStream. These instances are created whenever you call ImageIO.createImageInputStream(stream) and the useCache flag is true (the default). You can set it to false to disable the disk caching, at the expense of in-memory caching. This might make sense as you have a large heap, but probably not if you are processing very large images.
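If you do want to go that route, it's a single process-wide switch (call it before any ImageIO work):
// Disable ImageIO's disk cache so no /tmp/imageio*.tmp files are created;
// stream data is then cached in memory instead
ImageIO.setUseCache(false);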
I also think you are (almost) correct about the finalizer issue. You'll find the following finalize method in FileCacheImageInputStream (Sun JDK 6/1.6.0_26):
protected void finalize() throws Throwable {
// Empty finalizer: for performance reasons we instead use the
// Disposer mechanism for ensuring that the underlying
// RandomAccessFile is closed/deleted prior to garbage collection
}
There's some quite "interesting" code in the class' constructor, that sets up automatic stream closing and disposing when the instance is finalized (should client code forget to do so). This might be different in the OpenJDK implentation, at least it seems kind of hacky. It's also unclear to me at the moment exactly what "performance reasons" we are talking about...
In any case, it seems calling close on the ImageInputStream instance, as you now do, will properly close the file descriptor and delete the temp file.
Found it!
So a stream gets wrapped by another stream in a different area in the code:
iis = ImageIO.createImageInputStream(stream);
And further down, stream is closed.
This doesn't seem to leak any resources when running with Sun Java, but does seem to cause a leak when running with Open JDK.
I'm not sure why that is (I have not looked at source code to verify, though I have my guesses), but that's what seems to be happening. Once I explicitly closed the wrapping stream, all was well.
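In code, the fix amounted to something like this (a sketch, keeping the Java 6 style of the original code):
ImageInputStream iis = null;
try {
    iis = ImageIO.createImageInputStream(stream);
    // ... decode and process the image ...
} finally {
    if (iis != null) {
        try {
            iis.close(); // releases the file descriptor and deletes the backing temp file
        } catch (IOException ignored) {
        }
    }
    IOUtils.closeQuietly(stream);
}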
I'm new to all the memory management subject, so there are a lot of things I don't understand.
I'm trying to cache an image in my app, but I'm having trouble with its memory consumption:
All of the Bitmap caching code is pretty much copy-pasted from here: http://developer.android.com/training/displaying-bitmaps/index.html
I debugged the code and checked the heap size in the DDMS view in eclipse, and there is about 15mb jump after these code lines:
options.inJustDecodeBounds = false;
return BitmapFactory.decodeResource(res, resId, options);
in the "decodeSampledBitmapFromResource" method.
The image is a 1024x800, 75 KB jpg file. According to what I've already seen on the internet, the amount of memory this image is supposed to take is about 1024*800*4 (bytes per pixel) = 3.125 MB.
All of the threads regarding this subject don't say why it's taking much more memory than it should. Is there a way to cache one image with a reasonable amount of memory?
EDIT
I tried using the decodeFile method suggested in @ArshadParwez's answer below. Using this method, after the BitmapFactory.decodeStream method the memory is increased by only 3.5 MB - the problem is solved, sort of, but I want to cache bitmaps directly from the resources.
I noticed that during the decodeResource method there are 2 memory "jumps" - one of about 3.5 MB, which is reasonable, and another strange one of 14 MB. What are those 14 MB used for, and why does this happen?
Images are also scaled according to the density so they can use a lot of memory.
For example, if the image file is in the drawable folder (which is mdpi density) and you run it on an xhdpi device, both the width and the height would double. Maybe this link could help you, or this one.
So in your example the image would take:
(1024*2) * (800*2) * 4 = 13,107,200 bytes.
It would be even worse if you ran it on an xxhdpi device (like the HTC One or Galaxy S4).
What can you do? Either put the image file in the correct density folder (drawable-xhdpi or drawable-xxhdpi) or put it in drawable-nodpi (or in the assets folder) and downscale the image according to your needs.
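For example, to decode the resource at its original pixel size regardless of the device density (a small sketch; R.drawable.my_image stands in for your resource id, and this assumes you are inside an Activity):
BitmapFactory.Options options = new BitmapFactory.Options();
options.inScaled = false; // skip density scaling, so a 1024x800 image stays 1024x800 in memory
Bitmap bitmap = BitmapFactory.decodeResource(getResources(), R.drawable.my_image, options);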
BTW you don't have to set options.inJustDecodeBounds = false since that's the default behavior. In fact, you can pass null for the bitmap options.
About downscaling, you can use either Google's way or my way; each has its own advantages and disadvantages.
About caching, there are many ways to do it. The most common one is an LRU cache. There is also an alternative I created recently (link here or here) that allows you to cache a lot more images and avoid OOM, but it gives you a lot of responsibility.
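For the LRU route, a minimal sketch along the lines of the Android training docs (using one eighth of the app's max heap for the cache):
int maxKb = (int) (Runtime.getRuntime().maxMemory() / 1024 / 8);
LruCache<String, Bitmap> cache = new LruCache<String, Bitmap>(maxKb) {
    @Override
    protected int sizeOf(String key, Bitmap value) {
        return value.getByteCount() / 1024; // measure entries in KB, matching the cache size
    }
};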
You can use this method to pass in an image file and get a bitmap out of it:
public Bitmap decodeFile(File f) {
Bitmap b = null;
try {
// Decode image size
BitmapFactory.Options o = new BitmapFactory.Options();
o.inJustDecodeBounds = true;
FileInputStream fis = new FileInputStream(f);
BitmapFactory.decodeStream(fis, null, o);
fis.close();
int IMAGE_MAX_SIZE = 1000;
int scale = 1;
if (o.outHeight > IMAGE_MAX_SIZE || o.outWidth > IMAGE_MAX_SIZE) {
scale = (int) Math.pow(
2,
(int) Math.round(Math.log(IMAGE_MAX_SIZE
/ (double) Math.max(o.outHeight, o.outWidth))
/ Math.log(0.5)));
}
// Decode with inSampleSize
BitmapFactory.Options o2 = new BitmapFactory.Options();
o2.inSampleSize = scale;
fis = new FileInputStream(f);
b = BitmapFactory.decodeStream(fis, null, o2);
fis.close();
} catch (IOException e) {
e.printStackTrace();
}
return b;
}
@Ori Wasserman: As per your request I used a method to get images from the resource folder, and for that I used a 7 MB image. I put the 7 MB image in the "res->drawable" folder, and with the following code it didn't crash and the image was shown in the ImageView:
Bitmap image = BitmapFactory.decodeResource(getResources(), R.drawable.image_7mb);
Bitmap loBitmap = Bitmap.createScaledBitmap(image, width_of_screen, height_of_screen, true);
imageview.setImageBitmap(loBitmap);
I'm writing an application which reads and displays images as ImageIcons (within a JLabel), the application needs to be able to support jpegs and bitmaps.
For jpegs I find that passing the filename directly to the ImageIcon constructor works fine (even for displaying two large jpegs); however, if I use ImageIO.read to get the image and then pass the image to the ImageIcon constructor, I get an OutOfMemoryError (Java heap space) when the second image is read (using the same images as before).
For bitmaps, if I try to read by passing the filename to ImageIcon, nothing is displayed, however by reading the image with ImageIO.read and then using this image in the ImageIcon constructor works fine.
I understand from reading other forum posts that the reason the two methods don't behave the same for the different formats is down to Java's compatibility issues with bitmaps. However, is there a way around my problem so that I can use the same method for both bitmaps and jpegs without an OutOfMemoryError?
(I would like to avoid having to increase the heap size if possible!)
The OutOfMemoryError is triggered by this line:
img = getFileContentsAsImage(file);
and the method definition is:
public static BufferedImage getFileContentsAsImage(File file) throws FileNotFoundException {
BufferedImage img = null;
try {
ImageIO.setUseCache(false);
img = ImageIO.read(file);
img.flush();
} catch (IOException ex) {
//log error
}
return img;
}
The stack trace is:
Exception in thread "AWT-EventQueue-0" java.lang.OutOfMemoryError: Java heap space
at java.awt.image.DataBufferByte.<init>(DataBufferByte.java:58)
at java.awt.image.ComponentSampleModel.createDataBuffer(ComponentSampleModel.java:397)
at java.awt.image.Raster.createWritableRaster(Raster.java:938)
at javax.imageio.ImageTypeSpecifier.createBufferedImage(ImageTypeSpecifier.java:1056)
at javax.imageio.ImageReader.getDestination(ImageReader.java:2879)
at com.sun.imageio.plugins.jpeg.JPEGImageReader.readInternal(JPEGImageReader.java:925)
at com.sun.imageio.plugins.jpeg.JPEGImageReader.read(JPEGImageReader.java:897)
at javax.imageio.ImageIO.read(ImageIO.java:1422)
at javax.imageio.ImageIO.read(ImageIO.java:1282)
at framework.FileUtils.getFileContentsAsImage(FileUtils.java:33)
You are running out of memory because ImageIO.read() returns an uncompressed BufferedImage which is very large and is retained in the heap because it is referenced by the ImageIcon. However, the images returned by Toolkit.createImage remain in their compressed format (using the private ByteArrayImageSource class.)
You cannot read a BMP using Toolkit.createImage (and even if you could it would still remain uncompressed in memory and you would probably run out of heap space again) but what you can do is read the uncompressed image and save it in a byte array in compressed form, e.g.
public static ImageIcon getPNGIconFromFile(File file) throws IOException {
BufferedImage bitmap = ImageIO.read(file);
ByteArrayOutputStream bytes = new ByteArrayOutputStream();
ImageIO.write(bitmap, "PNG", bytes);
return new ImageIcon(bytes.toByteArray());
}
That way the only time the uncompressed bitmap must be held in memory is when it is being loaded or rendered.
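Usage is then just, for example (exception handling omitted; the file name is only a placeholder):
JLabel label = new JLabel(getPNGIconFromFile(new File("photo.bmp")));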
Have you tried this?
ImageIcon im = new ImageIcon(Toolkit.getDefaultToolkit().createImage("filename"));
Couldn't it be that you indeed just ran out of memory? I mean, does the error still occur if you run java with, say, -Xmx1g?