I have run into a situation where I am getting out-of-memory exceptions when trying to convert an input stream into a byte array. On newer Android phones it's no problem, but some of the cheaper models are experiencing it. Here is the method I'm using. Is there a more efficient way to do this?
public byte[] convertStreamToByteArray(InputStream is) throws IOException {
    ByteArrayOutputStream baos = new ByteArrayOutputStream();
    byte[] buff = new byte[1024];
    int i = 0;
    while ((i = is.read(buff, 0, buff.length)) > 0) {
        baos.write(buff, 0, i);
    }
    return baos.toByteArray();
}
There is: do not copy all the data from the stream into memory.
Why do you need to do so?
Can't you read and consume the stream data as it comes, without holding the whole thing in memory?
If you need some buffering, buffer only as much as is needed for the data to be consumable/processed, then throw that buffer away and create a new one, so that you use as little memory at a time as possible.
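A minimal sketch of that idea, where processChunk is a hypothetical placeholder for whatever you actually do with the data:
import java.io.IOException;
import java.io.InputStream;

public class StreamConsumer {

    // Process the stream in small chunks instead of accumulating it all in memory.
    public void consumeStream(InputStream is) throws IOException {
        byte[] buff = new byte[1024];     // small, reusable buffer
        int read;
        while ((read = is.read(buff)) > 0) {
            processChunk(buff, read);     // handle only the bytes that were just read
        }
    }

    // Hypothetical placeholder: replace with your real processing
    // (write to a file, feed a decoder, update a digest, ...).
    private void processChunk(byte[] data, int length) {
    }
}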
See if IOUtils can help you out here.
Import it
import org.apache.commons.io.IOUtils;
Then try:
public byte[] convertStreamToByteArray(InputStream is) throws IOException {
    ByteArrayOutputStream baos = new ByteArrayOutputStream();
    IOUtils.copy(is, baos);
    return baos.toByteArray();
}
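For what it's worth, commons-io also has a one-call shortcut that does the same thing:
public byte[] convertStreamToByteArray(InputStream is) throws IOException {
    // IOUtils.toByteArray reads the whole stream into a byte[] for you
    return IOUtils.toByteArray(is);
}
Keep in mind that either way the entire content still ends up in memory, so this alone won't make the OutOfMemoryError go away on low-memory devices.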
Try to create a byte array with the size of the InputStream, like this:
public byte[] convertStreamToByteArray(InputStream is) throws IOException {
    ByteArrayOutputStream baos = new ByteArrayOutputStream();
    byte[] buff = new byte[is.available()];
    int i = 0;
    while ((i = is.read(buff, 0, buff.length)) > 0) {
        baos.write(buff, 0, i);
    }
    return baos.toByteArray();
}
Related
I am trying to compress an array of bytes into another array of bytes using GZIPOutputStream (in Java).
This is my code:
@Test
public void testCompressBytes() throws IOException {
    final byte[] uncompressed = RandomStringUtils.randomAlphanumeric(100000 /* 100 kb */).getBytes();
    // compress
    byte[] compressed;
    try (InputStream is = new ByteArrayInputStream(uncompressed);
         ByteArrayOutputStream baos = new ByteArrayOutputStream();
         OutputStream os = new GZIPOutputStream(baos)) {
        IOUtils.copy(is, os); // org.apache.commons.io
        os.flush();
        compressed = baos.toByteArray();
    }
    System.out.println("Size before compression = " + uncompressed.length + ", after = " + compressed.length);
    // decompress back
    byte[] decompressedBack;
    try (InputStream is = new GZIPInputStream(new ByteArrayInputStream(compressed));
         ByteArrayOutputStream baos = new ByteArrayOutputStream()) {
        IOUtils.copy(is, baos); // EXCEPTION THROWN HERE
        baos.flush();
        decompressedBack = baos.toByteArray();
    }
    assertArrayEquals(uncompressed, decompressedBack);
}
And this is the output I'm getting:
Size before compression = 100000, after = 63920
java.io.EOFException: Unexpected end of ZLIB input stream
What could I be doing wrong?
You need to call GZIPOutputStream::close before calling ByteArrayOutputStream::toByteArray, so that the GZIPOutputStream writes out all of its trailer bytes.
In your current code you are calling ByteArrayOutputStream::toByteArray before GZIPOutputStream::close (which only happens via try-with-resources when the block exits), and that's why it doesn't work.
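If you want to keep everything inside the try-with-resources block, a minimal sketch of the compression part is to call finish() (which writes the remaining compressed data and the trailer without closing the underlying stream) before toByteArray():
byte[] compressed;
try (ByteArrayOutputStream baos = new ByteArrayOutputStream();
     GZIPOutputStream gzos = new GZIPOutputStream(baos)) {
    gzos.write(uncompressed);
    gzos.finish();                   // flush remaining compressed data and the GZIP trailer
    compressed = baos.toByteArray(); // now a complete GZIP stream
}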
Thanks, everybody!
Although calling GZIPOutputStream::finish() before ByteArrayOutputStream::toByteArray() seems to do the trick, I believe it's better to completely close the GZIP stream first, which in turn forces us to keep ByteArrayOutputStream outside the try-with-resources clause.
So, my reworked compression part looks like that now:
final ByteArrayOutputStream baos = new ByteArrayOutputStream();
try (InputStream is = new ByteArrayInputStream(uncompressed);
     GZIPOutputStream gzos = new GZIPOutputStream(baos)) {
    IOUtils.copy(is, gzos);
} catch (final IOException e) {
    throw new RuntimeException(e);
}
IOUtils.closeQuietly(baos);
final byte[] compressed = baos.toByteArray();
Before decompressing an array compressed with GZIP, I would like to know the length of the original data, so that I can allocate a suitably sized buffer array, like the "1024" in the following code. Thanks!
byte[] buffer = new byte[1024];
ByteArrayOutputStream out = new ByteArrayOutputStream(1024);
try (ByteArrayInputStream bin = new ByteArrayInputStream(localByteBuf);
     GZIPInputStream gis = new GZIPInputStream(bin)) {
    int len;
    while ((len = gis.read(buffer)) > 0) {
        out.write(buffer, 0, len);
    }
} catch (IOException e) {
    e.printStackTrace();
}
ByteArrayOutputStream will take care of growing the internal byte[] array. As per the class javadoc:
This class implements an output stream in which the data is written into a byte array. The buffer automatically grows as data is written to it.
You most likely want to copy the stream using InputStream.transferTo() to make the code more readable:
ByteArrayOutputStream out = new ByteArrayOutputStream(); // default 32 size
GZIPInputStream in = new GZIPInputStream(new ByteArrayInputStream(localByteBuf));
in.transferTo(out);
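Wrapped in try-with-resources, a complete sketch could look like this (transferTo is available from Java 9 onwards; the gunzip method name is just for illustration):
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.util.zip.GZIPInputStream;

public class GzipUtil {

    // Decompress a GZIP-compressed byte[] without knowing the original length up front.
    public static byte[] gunzip(byte[] localByteBuf) throws IOException {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        try (GZIPInputStream in = new GZIPInputStream(new ByteArrayInputStream(localByteBuf))) {
            in.transferTo(out); // ByteArrayOutputStream grows as needed
        }
        return out.toByteArray();
    }
}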
I'm trying to download some images provided by a hoster. This is the method I use:
public static void downloadImage(String imageLink, File f) throws IOException
{
    URL url = new URL(imageLink);
    byte[] buffer = new byte[1024];
    BufferedInputStream in = new BufferedInputStream(url.openStream(), buffer.length);
    BufferedOutputStream out = new BufferedOutputStream(new FileOutputStream(f), buffer.length);
    while (in.read(buffer) > 0)
        out.write(buffer);
    out.flush();
    out.close();
    in.close();
}
However, the files turn out too big. 5 MB for an 80x60 jpg is too much in my opinion.
What could be the cause of this?
You are doing things wrong here: read() returns the number of bytes that were actually read, so you have to write exactly that number of bytes from your buffer array into your output stream.
Your code is corrupting your output by always writing out the full buffer array, which is mostly 0s or leftovers from previous reads!
Instead do something like:
int bytesRead;
while ((bytesRead = in.read(buffer)) > 0) {
    byte[] outBuffer = new byte[bytesRead];
    System.arraycopy(buffer, 0, outBuffer, 0, bytesRead); // copy only the bytes that were read
    out.write(outBuffer);
}
(this is meant as inspiration to get you going, closer to pseudocode than production code)
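That said, the temporary array isn't strictly necessary; OutputStream.write has an overload that takes an offset and a length, so the same fix can be written more leanly as:
int bytesRead;
while ((bytesRead = in.read(buffer)) > 0) {
    out.write(buffer, 0, bytesRead); // write only what was actually read
}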
I'm trying to convert audio (mp3/wav etc.) to a byte array. I did it using an InputStream-to-byte-array conversion.
The problem is that after a few hundred samples I receive only zeroes.
At first I thought the problem was with the file, so I tried debugging with another file and had the same problem.
Then I thought the problem was with the code, so I tried using IOUtils and got the exact same results.
Can anyone tell me what I'm doing wrong?
The code I used:
File file = new File(path);
final InputStream inputStream = new FileInputStream(file);
byte[] byteSamples = inputStreamToByteArray(inputStream);

public byte[] inputStreamToByteArray(InputStream inStream) throws IOException {
    ByteArrayOutputStream baos = new ByteArrayOutputStream();
    byte[] buffer = new byte[8192];
    int bytesRead;
    while ((bytesRead = inStream.read(buffer)) > 0) {
        baos.write(buffer, 0, bytesRead);
    }
    return baos.toByteArray();
}
The alternate version using IOUtils:
byte[] byteSamples = IOUtils.toByteArray(inputStream);
Update: I tried doing it using a BufferedInputStream; still the exact same results.
byte[] byteSamples = new byte[(int) file.length()];
try {
    BufferedInputStream buf = new BufferedInputStream(new FileInputStream(file));
    buf.read(byteSamples, 0, byteSamples.length);
    buf.close();
} catch (FileNotFoundException e) {
    e.printStackTrace();
}
You need to close the streams when done.
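A minimal sketch of that for the first snippet, using try-with-resources so the stream is closed even if an exception is thrown:
File file = new File(path);
byte[] byteSamples;
try (InputStream inputStream = new FileInputStream(file)) {
    byteSamples = inputStreamToByteArray(inputStream); // stream is closed automatically when the block exits
}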
I am experiencing a memory problem that I do not understand.
I have the following case
Case 1
public byte[] getBytes(InputStream is) throws IOException {
    int len;
    int size = 1024;
    byte[] buf;
    ByteArrayOutputStream bos = new ByteArrayOutputStream();
    buf = new byte[size];
    while ((len = is.read(buf, 0, size)) != -1)
    {
        bos.write(buf, 0, len);
    }
    buf = bos.toByteArray();
    return buf;
}
public void dosomething()
{
    // instructions
    InputStream is = new ByteArrayInputStream(getBytes(bodyPart.getInputStream()));
}
This works fine without error, but this:
Case 2
public void dosomething()
{
    // instructions
    ByteArrayOutputStream bos = new ByteArrayOutputStream();
    int len;
    int size = 1024;
    byte[] bufferFichierEntree = new byte[size];
    while ((len = bodyPart.getInputStream().read(bufferFichierEntree, 0, size)) != -1)
    {
        bos.write(bufferFichierEntree, 0, len);
    }
    InputStream is = new ByteArrayInputStream(bufferFichierEntree);
}
throws a java.lang.OutOfMemoryError: Java heap space, and I don't know why.
The only difference is that in the first case I use a separate method, unlike in the second case.
In
while ((len = bodyPart.getInputStream().read(bufferFichierEntree, 0, size)) != -1)
you are creating a new InputStream on every iteration of the loop, so you only ever read the first bytes of the stream each time.
Try creating the input stream before the while loop and using it the way you did in your first example.
The reason for it might be:
When you use two methods, the ByteArrayOutputStream goes out of scope and can be cleaned up by the garbage collector (GC).
When you use only one method, the buffer can't be cleaned up by the GC, since it is still in scope, unless you nullify it.
Of course, the other reason might be that you are creating a new InputStream each time inside your loop; we don't know exactly what bodyPart.getInputStream() does. If that is the case, solve it like this:
InputStream in = bodyPart.getInputStream();
while ((len = in.read(bufferFichierEntree, 0, size)) != -1)
{
    bos.write(bufferFichierEntree, 0, len);
}
Seems like a scoping problem with getInputStream: in the first example you are using one InputStream; in the second, a new one on every iteration, reading only the first 1024 bytes each time. If you change it to something like:
InputStream is = bodyPart.getInputStream();
while ((len = is.read(bufferFichierEntree, 0, size)) != -1) {
    bos.write(bufferFichierEntree, 0, len);
}
it should run as expected.