I need to convert a BufferedImage to a byte[], but it is too slow. The byte array is eventually base64 encoded and sent to an Android client. The method I have been using looks like this:
public static byte[] imageToBytes(BufferedImage im) throws IOException {
    // make sure the image is non-null
    if (im != null) {
        ByteArrayOutputStream baos = new ByteArrayOutputStream();
        // encode the image as PNG into the stream
        ImageIO.write(im, "png", baos);
        baos.flush();
        // grab the raw PNG bytes (base64 encoding happens later)
        byte[] imageInByte = baos.toByteArray();
        baos.close();
        return imageInByte;
    }
    return null;
}
This is far too slow for my program. Changing "png" to "jpeg" makes it fail on the mobile side, and the JPEGCodec version also fails there. By "fail" I mean the Android method BitmapFactory.decodeByteArray() returns null.
You don't say why it's too slow, and there isn't much that can be done to make this code faster, because this is essentially the only way to convert a BufferedImage to a PNG byte stream. But here are some pointers:
The class ByteArrayOutputStream is synchronized, which costs performance. You can find a faster, unsynchronized implementation in Commons IO (org.apache.commons.io.output.ByteArrayOutputStream).
Depending on the size of your image, the allocation strategy of ByteArrayOutputStream might be a problem. Start it with an initial size of 1 or 10 KiB instead of the default 32 bytes.
toByteArray() gives you a copy of the internal buffer. If you don't need a copy (and usually you don't), writing your own OutputStream that exposes its buffer can give you an additional speed boost (not to mention fewer GC runs).
Would not converting to a byte[] at all be an option? You say the data "is eventually base64 encoded and sent to an android client". What about wrapping the OutputStream that writes to the Android client in something like Base64OutputStream, avoiding the intermediate array entirely?
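The last pointer can be sketched with the JDK's built-in java.util.Base64, which can wrap any OutputStream since Java 8 (the class and method names below, and the idea that the server holds an OutputStream to the client, are illustrative assumptions, not the asker's actual API):

```java
import java.awt.image.BufferedImage;
import java.io.IOException;
import java.io.OutputStream;
import java.util.Base64;
import javax.imageio.ImageIO;

public class StreamingEncoder {
    // Encode the image straight into the client's stream, base64-encoding
    // on the fly -- no intermediate byte[] copy is ever built.
    public static void writeImageBase64(BufferedImage im, OutputStream clientOut)
            throws IOException {
        OutputStream b64 = Base64.getEncoder().wrap(clientOut);
        ImageIO.write(im, "png", b64);
        // closing the wrapper writes the final base64 padding
        // and closes the underlying stream as well
        b64.close();
    }
}
```

Note that Base64.getEncoder() emits no line breaks, which Android's Base64.decode(..., Base64.DEFAULT) accepts; Commons Codec's Base64OutputStream would work the same way if you need line wrapping.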
Related
I have a lot of operations that generate a BufferedImage object using a third-party library and save it to a jpg file. A single run can involve over 10,000 such saves. Currently I'm using ImageIO.write(image, "jpg", file) directly, but the performance is not satisfactory.
I'm wondering if I can use a direct ByteBuffer to make the disk writes faster. I'm thinking about putting the BufferedImage into a direct ByteBuffer and then saving it to disk through a FileChannel, but I haven't found a way to do so. How can I put a BufferedImage into a direct ByteBuffer? Some example code would help a lot.
import com.sun.image.codec.jpeg.JPEGCodec;
import com.sun.image.codec.jpeg.JPEGImageEncoder;

public byte[] toByteArray(BufferedImage image) throws IOException {
    ByteArrayOutputStream baos = new ByteArrayOutputStream();
    JPEGImageEncoder encoder = JPEGCodec.createJPEGEncoder(baos);
    encoder.encode(image);
    return baos.toByteArray();
}
From this answer, try this. It should be faster.
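Be aware that the com.sun.image.codec.jpeg classes are JDK-internal and were removed in Java 7, so a version built on the public ImageIO API is safer. A minimal sketch (class and method names are illustrative) that also sets an explicit compression quality:

```java
import java.awt.image.BufferedImage;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import javax.imageio.IIOImage;
import javax.imageio.ImageIO;
import javax.imageio.ImageWriteParam;
import javax.imageio.ImageWriter;
import javax.imageio.stream.MemoryCacheImageOutputStream;

public class JpegEncoder {
    // Encode a BufferedImage as JPEG with an explicit quality (0.0f..1.0f).
    // The image should be TYPE_INT_RGB: images with an alpha channel often
    // produce JPEG files that other decoders reject, which may be why the
    // Android BitmapFactory in the first question returned null.
    public static byte[] toJpegBytes(BufferedImage image, float quality)
            throws IOException {
        ImageWriter writer = ImageIO.getImageWritersByFormatName("jpeg").next();
        ImageWriteParam param = writer.getDefaultWriteParam();
        param.setCompressionMode(ImageWriteParam.MODE_EXPLICIT);
        param.setCompressionQuality(quality);

        ByteArrayOutputStream baos = new ByteArrayOutputStream(16 * 1024);
        MemoryCacheImageOutputStream ios = new MemoryCacheImageOutputStream(baos);
        writer.setOutput(ios);
        writer.write(null, new IIOImage(image, null, null), param);
        ios.close(); // flush cached data into baos
        writer.dispose();
        return baos.toByteArray();
    }
}
```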
In a java program I am compressing an InputStream like this:
ChannelBufferOutputStream outputStream =
        new ChannelBufferOutputStream(ChannelBuffers.dynamicBuffer(BUFFER_SIZE));
GZIPOutputStream compressedOutputStream = new GZIPOutputStream(outputStream);
try {
    IOUtils.copy(inputStream, compressedOutputStream);
} finally {
    // this should print the byte size after compression
    System.out.println(outputStream.writtenBytes());
}
I am testing this code with a JSON file that is ~31,000 bytes uncompressed and ~7,000 bytes compressed on disk. Feeding an InputStream that wraps the uncompressed JSON file into the code above, outputStream.writtenBytes() returns 10, which would mean it compressed down to only 10 bytes. That seems wrong, so I wonder where the problem is. The ChannelBufferOutputStream javadoc says: "Returns the number of written bytes by this stream so far." So it should be working.
Try calling GZIPOutputStream.finish() (or at least flush()) before counting the bytes; until then the deflater is still buffering data internally, which is why you see only the 10-byte GZIP header.
If that does not work, you can create a proxy stream whose only job is to count the number of bytes that pass through it.
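Both suggestions can be sketched together with a small FilterOutputStream subclass (the class name CountingOutputStream is illustrative; Commons IO ships a ready-made one under the same name):

```java
import java.io.FilterOutputStream;
import java.io.IOException;
import java.io.OutputStream;

// A minimal counting proxy: counts every byte written through it.
public class CountingOutputStream extends FilterOutputStream {
    private long count;

    public CountingOutputStream(OutputStream out) {
        super(out);
    }

    @Override
    public void write(int b) throws IOException {
        out.write(b);
        count++;
    }

    @Override
    public void write(byte[] b, int off, int len) throws IOException {
        out.write(b, off, len);
        count += len;
    }

    public long getCount() {
        return count;
    }
}
```

To measure the compressed size, place the counter under the GZIPOutputStream, i.e. new GZIPOutputStream(new CountingOutputStream(target)), and call finish() on the GZIPOutputStream before reading the count.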
I need to optimize an application that uses too much heap memory.
I'm having a problem closing a ByteArrayOutputStream variable after using it. I've tried calling close(), but it does not help. This is the code:
ByteArrayOutputStream zipOutTempStream = new ByteArrayOutputStream();
// arquivo.getZipStream() holds the XML received by FTP.
// StreamUtils.copiarStream copies the XML into zipOutTempStream.
StreamUtils.copiarStream(arquivo.getZipStream(), zipOutTempStream);
// Create a new XML file to write to.
File arquivo1 = new File("C:/XML.xml");
if (arquivo1.exists()) {
    System.out.println("ele existe");
} else {
    if (arquivo1.createNewFile()) {
        System.out.println("arquivo criado");
    } else {
        System.out.println("arquivo não criado");
    }
}
FileOutputStream arquivo2 = new FileOutputStream(arquivo1);
// Copy the unzipped XML into the newly created file.
StreamUtils.copiarStream(StreamUtils.uncompressXmlFromZipStream(
        new ByteArrayInputStream(zipOutTempStream.toByteArray())), arquivo2);
arquivo.setZipStream(null);
arquivo.setXmlStream(null);
return arquivo;
You cannot meaningfully close a ByteArrayOutputStream, since its close() method is documented as:
Closing a ByteArrayOutputStream has no effect. The methods in this
class can be called after the stream has been closed without
generating an IOException.
This output stream is backed by an in-memory array; it is not a buffered stream. If it is using too much memory, write your bytes directly to their final destination, such as a file or a socket, using an appropriate OutputStream.
I think you are simply keeping too much data in memory; close() has nothing to do with it. In fact there is no need to close a ByteArrayOutputStream at all. Here you are copying the ZIP file into a wrapped byte[] array:
ByteArrayOutputStream zipOutTempStream = new ByteArrayOutputStream();
StreamUtils.copiarStream(arquivo.getZipStream(), zipOutTempStream);
and a few lines later you convert the byte[] array back into an InputStream:
StreamUtils.copiarStream(StreamUtils.uncompressXmlFromZipStream(
new ByteArrayInputStream(zipOutTempStream.toByteArray())
), arquivo2);
It seems likely that this generated byte[] array is huge (confirm with logging). Instead of holding the whole ZIP file in memory as a byte[], store it in a temporary file and read it back.
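The temp-file pattern can be sketched with plain java.io (the class and method names below are illustrative; the asker's StreamUtils helper could replace the copy loop):

```java
import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;

public class TempFileCopy {
    // Spool an input stream to a temporary file instead of
    // buffering the whole thing in a byte[] on the heap.
    public static File spoolToTempFile(InputStream in) throws IOException {
        File tmp = File.createTempFile("zip-", ".tmp");
        tmp.deleteOnExit();
        try (FileOutputStream out = new FileOutputStream(tmp)) {
            byte[] buf = new byte[8192];
            int n;
            while ((n = in.read(buf)) != -1) {
                out.write(buf, 0, n);
            }
        }
        return tmp;
    }
}
```

A new FileInputStream over the returned file would then take the place of the ByteArrayInputStream in the original code, so at no point does the whole ZIP live in memory.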
Calling the simple getBytes() does produce the bytes, but Excel throws a warning:
Lost Document information
Googling around gave me this link, and the Javadocs for the worksheet class and the POI HOW-TO say similar things: basically, I cannot get the bytes without losing some information and should use the write method instead.
While write does work fine, I really need to send the bytes over. Is there any way I can do that, i.e. get the bytes without the warning?
As that mailing list post said:
Invoking HSSFWorkbook.getBytes() does not return all of the data necessary to reconstruct a complete Excel file.
You can use the write method with a ByteArrayOutputStream to get at the byte array.
ByteArrayOutputStream bos = new ByteArrayOutputStream();
try {
    workbook.write(bos);
} finally {
    bos.close();
}
byte[] bytes = bos.toByteArray();
(The close call is not strictly needed for a ByteArrayOutputStream, but imho it is good style to include it anyway, in case the stream is later changed to a different kind.)
How about:
ByteArrayOutputStream baos = new ByteArrayOutputStream();
workbook.write(baos);
byte[] xls = baos.toByteArray();
In order to get a full excel file out, you must call the write(OutputStream) method. If you want bytes from that, just give a ByteArrayOutputStream
I'm updating some old code to grab some binary data from a URL instead of from a database (the data is about to be moved out of the database and will be accessible by HTTP instead). The database API seemed to provide the data as a raw byte array directly, and the code in question wrote this array to a file using a BufferedOutputStream.
I'm not at all familiar with Java, but a bit of googling led me to this code:
URL u = new URL("my-url-string");
URLConnection uc = u.openConnection();
uc.connect();
InputStream in = uc.getInputStream();
ByteArrayOutputStream out = new ByteArrayOutputStream();

final int BUF_SIZE = 1 << 8;
byte[] buffer = new byte[BUF_SIZE];
int bytesRead = -1;
while ((bytesRead = in.read(buffer)) > -1) {
    out.write(buffer, 0, bytesRead);
}
in.close();
fileBytes = out.toByteArray();
That seems to work most of the time, but I have a problem when the data being copied is large: I'm getting an OutOfMemoryError for data items that worked fine with the old code.
I'm guessing that's because this version of the code has multiple copies of the data in memory at the same time, whereas the original code didn't.
Is there a simple way to grab binary data from a URL and save it in a file without incurring the cost of multiple copies in memory?
Instead of writing the data to a byte array and then dumping it to a file, you can write it directly to a file by replacing the following:
ByteArrayOutputStream out = new ByteArrayOutputStream();
With:
FileOutputStream out = new FileOutputStream("filename");
If you do so, there is no need for the out.toByteArray() call at the end. Just make sure you close the FileOutputStream object when done:
out.close();
See the documentation of FileOutputStream for more details.
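Putting the two changes together, the whole download can run in constant memory with only one small buffer alive at a time (the class and method names here are illustrative, not part of the asker's code):

```java
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.net.URL;
import java.net.URLConnection;

public class Downloader {
    // Stream the response body straight to disk; only the 8 KiB
    // buffer is ever held in memory, regardless of content size.
    public static void downloadToFile(String urlString, String fileName)
            throws IOException {
        URLConnection uc = new URL(urlString).openConnection();
        try (InputStream in = uc.getInputStream();
             OutputStream out = new FileOutputStream(fileName)) {
            byte[] buffer = new byte[8192];
            int bytesRead;
            while ((bytesRead = in.read(buffer)) != -1) {
                out.write(buffer, 0, bytesRead);
            }
        }
    }
}
```

The try-with-resources form also closes both streams even if the copy throws, which the original snippet did not guarantee.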
I don't know what you mean by "large" data, but try the JVM parameter
java -Xmx256m ...
which sets the maximum heap size to 256 MB (or any value you like). Note there must be no space between -Xmx and the value.
If you need the Content-Length and your web server is reasonably standards-conforming, it should send a "Content-Length" header.
URLConnection#getContentLength() gives you that information up front so that you can size your file. (Be aware that if your HTTP server is misconfigured or under the control of an evil entity, that header may not match the number of bytes actually received. In that case, why don't you stream to a temp file first and copy that file afterwards?)
In addition to that: ByteArrayOutputStream is a horrible memory allocator. It always doubles its buffer size, so if you read a 32 MB + 1 byte file, you end up with a 64 MB buffer. It might be better to implement your own, smarter byte-array stream, like this one:
http://source.pentaho.org/pentaho-reporting/engines/classic/trunk/core/source/org/pentaho/reporting/engine/classic/core/util/MemoryByteArrayOutputStream.java
Subclassing ByteArrayOutputStream gives you access to the internal buffer and the number of valid bytes in it.
But of course, if all you want to do is store the data in a file, you are better off using a FileOutputStream.
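The subclassing trick is tiny, because buf and count are protected fields of java.io.ByteArrayOutputStream; a minimal sketch (the class name is illustrative):

```java
import java.io.ByteArrayOutputStream;

// Exposes the live internal buffer so callers can use the bytes in place,
// avoiding the defensive copy that toByteArray() makes on every call.
public class ExposedByteArrayOutputStream extends ByteArrayOutputStream {
    public ExposedByteArrayOutputStream(int initialSize) {
        super(initialSize);
    }

    public byte[] buffer() {
        return buf;   // the live internal array; may be longer than length()
    }

    public int length() {
        return count; // number of valid bytes at the start of buffer()
    }
}
```

For example, out.write(stream.buffer(), 0, stream.length()) writes the accumulated bytes to a file without ever duplicating them; the caller just must not hold on to buffer() across further writes, since the array may be reallocated as the stream grows.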