ByteArrayOutput/InputStream on transparent image - java

I'm trying to send a BufferedImage over a socket. I do this by converting the image to a byte[], encoding it in Base64, and then sending it over. I'm sending two BufferedImages: one of them is "full", the other is about 50% transparent. The problem I'm having is that when they arrive, the second image is still visually transparent, but the data array I get via the Raster has been changed.
I made a small test case to demonstrate the problem:
BufferedImage levelBufferedOriginal = ...
BufferedImage backgroundBufferedOriginal = ...
byte[] levelDataOriginal = ((DataBufferByte) levelBufferedOriginal.getRaster().getDataBuffer()).getData();
byte[] backgroundDataOriginal = ((DataBufferByte) backgroundBufferedOriginal.getRaster().getDataBuffer()).getData();
ByteArrayOutputStream baos = new ByteArrayOutputStream();
byte[] temp = null, temp2 = null;
try {
    ImageIO.write(levelBufferedOriginal, "png", baos);
    baos.flush();
    temp = baos.toByteArray();
    baos.close();

    baos = new ByteArrayOutputStream();
    ImageIO.write(backgroundBufferedOriginal, "png", baos);
    baos.flush();
    temp2 = baos.toByteArray();
    baos.close();
} catch (IOException e1) {
    e1.printStackTrace();
}

BufferedImage levelBufferedNew = null;
BufferedImage backgroundBufferedNew = null;
try {
    levelBufferedNew = ImageIO.read(new ByteArrayInputStream(temp));
    backgroundBufferedNew = ImageIO.read(new ByteArrayInputStream(temp2));
} catch (IOException e) {
    e.printStackTrace();
}

byte[] levelDataNew = ((DataBufferByte) levelBufferedNew.getRaster().getDataBuffer()).getData();
byte[] backgroundDataNew = ((DataBufferByte) backgroundBufferedNew.getRaster().getDataBuffer()).getData();
System.out.println("LEVEL: " + Arrays.equals(levelDataOriginal, levelDataNew));
System.out.println("BACKGROUND: " + Arrays.equals(backgroundDataOriginal, backgroundDataNew));
All I do here is transform the BufferedImages to byte[] and back, then compare the data I get from the DataBufferBytes. The output is
LEVEL: false
BACKGROUND: true
Background is the "full" image, and Level is the one with some transparent pixels.
If the general idea is wrong, I'd like to hear another one; all I want is to be able to exactly recreate the two BufferedImages.

edit: What we have established so far:
The images (both before and after) are TYPE_BYTE_INDEXED (13) with IndexColorModel (color map)
The before image has a transparent color in its color map, at index 255 (which is the value -1 in the byte array, as Java uses signed bytes). The after image has a different, non-transparent color at this index.
The images are serialized/deserialized in PNG format, using ImageIO
The images are visually equal, but the raw pixel data (the byte array) differs
Which leads to the conclusion that the ImageIO PNGImageWriter re-arranges the entries in the color map when writing, resulting in a different color map and thus different pixel data.
This basically leaves us with two options:
Serialize the image data in a different way, to ensure the color map/pixel data are not modified. It is possible to send the pixel data array, along with the color map array and the height/width of the image, and then re-create the image exactly at the client. This is quite a bit of code, and is probably covered by other questions on SO.
Don't rely on the pixel data/color maps being the same. Use the value of ((IndexColorModel) levelBufferedNew.getColorModel()).getTransparentPixel() to test for/set transparency instead of the hardcoded value -1. This requires pretty much no other change in your code.
Note: These solutions will only work for TYPE_BYTE_INDEXED (13) images.
For a more generic (but possibly slower) approach, use the code in the original answer to set transparent parts, and use (levelBufferedNew.getRGB(x, y) >> 24) == 0 to test for transparency. This should work even for TYPE_INT_ARGB or TYPE_4BYTE_ABGR.
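That generic test can be sketched as follows; the image construction below is just a stand-in for a deserialized image, and the explicit alpha mask makes the intent clearer than the bare shift:

```java
import java.awt.image.BufferedImage;

public class TransparencyCheck {
    // True if the pixel at (x, y) is fully transparent, regardless of the
    // image's underlying type (ARGB, ABGR, indexed, ...): getRGB always
    // converts the pixel to default ARGB, so we only inspect the alpha byte.
    static boolean isTransparent(BufferedImage img, int x, int y) {
        return ((img.getRGB(x, y) >> 24) & 0xFF) == 0;
    }

    public static void main(String[] args) {
        BufferedImage img = new BufferedImage(2, 1, BufferedImage.TYPE_INT_ARGB);
        img.setRGB(0, 0, 0x00000000); // fully transparent pixel
        img.setRGB(1, 0, 0xFF00FF00); // opaque green pixel
        System.out.println(isTransparent(img, 0, 0)); // true
        System.out.println(isTransparent(img, 1, 0)); // false
    }
}
```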
original answer:
Instead of fiddling with the image at byte array level, why not try using normal Java2D? ;-)
Something like:
Graphics2D g = levelBufferedNew.createGraphics();
try {
    g.setComposite(AlphaComposite.Clear);
    g.fillOval(x, y, w, h); // The area you want to make transparent
}
finally {
    g.dispose();
}
...should work.
PS: As the images use an IndexColorModel, you can use getTransparentPixel() to get the transparent pixel index, instead of relying on it being at a certain index (-1/255). Then you can still manipulate at the byte array level. ;-)
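A rough sketch of that PS; the three-entry palette below is made up purely for illustration, standing in for whatever color map the PNG writer happens to produce:

```java
import java.awt.image.BufferedImage;
import java.awt.image.DataBufferByte;
import java.awt.image.IndexColorModel;

public class IndexedTransparency {
    public static void main(String[] args) {
        // Small indexed palette where entry 2 is declared transparent
        // (a stand-in for whichever index the PNG writer assigned).
        byte[] r = {(byte) 255, 0, 0};
        byte[] g = {0, (byte) 255, 0};
        byte[] b = {0, 0, 0};
        IndexColorModel icm = new IndexColorModel(8, 3, r, g, b, 2);
        BufferedImage img = new BufferedImage(2, 1, BufferedImage.TYPE_BYTE_INDEXED, icm);

        // Look up the transparent index instead of assuming it is 255/-1.
        int transparent = ((IndexColorModel) img.getColorModel()).getTransparentPixel();
        byte[] pixels = ((DataBufferByte) img.getRaster().getDataBuffer()).getData();
        pixels[0] = (byte) transparent; // make the first pixel transparent

        System.out.println(transparent);                    // 2
        System.out.println((img.getRGB(0, 0) >>> 24) == 0); // true
    }
}
```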

Related

From image to BufferedImage to image

I'm trying to read a 256x256 image using ImageIO.read, transform it into a ByteArray, then into a BufferedImage, and back into an image file using ImageIO.write. It all seems to work as it should, but the final image is quite corrupted (although clearly still based on the original image). I can't find what's wrong in the process; I am suspicious of the scansize parameter, which I don't completely understand.
The idea is to manipulate pixels in between the reading and writing, but at the moment I can't even recreate the original image back into itself.
I attach the original image and the processed one below:
import java.awt.image.BufferedImage
import java.io.ByteArrayOutputStream
import java.io.File
import java.io.IOException
import javax.imageio.ImageIO

fun main(args: Array<String>) {
    val bImage = ImageIO.read(File("original.tiff"))
    val bos = ByteArrayOutputStream()
    ImageIO.write(bImage, "tiff", bos)
    val data = bos.toByteArray()
    val width = 256
    val height = 256
    val bytesPerPixel = 3
    val len = width * height * bytesPerPixel
    val image = BufferedImage(width, height, BufferedImage.TYPE_INT_RGB)
    val arr = IntArray(len)
    for (i in 0 until len) arr[i] = data.get(i).toInt()
    image.setRGB(0, 0, width, height, arr, 0, 256) // Seems like something is wrong here
    try {
        ImageIO.write(image, "jpg", File("converted-grayscale-002.jpg"))
    } catch (e: IOException) {
        System.err.println("IOException: $e")
    }
}
This line is not returning the RGB image data:
val data = bos.toByteArray()
it is returning the compressed stream of the image in TIFF format, which is certainly not the raw image data.
To get the pixels, use BufferedImage.getRGB(); alternatively you can access the image's backing buffer directly, but for that you need to know the underlying buffer type, which is a broad topic.
Take a look at this answer; it should give you an idea of how to do it: convert a RGB image to grayscale Image reducing the memory in java
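To illustrate the suggested approach in Java (the question is in Kotlin, but the BufferedImage API is identical): copy pixels with getRGB/setRGB rather than re-encoding through ImageIO.write, and note that the scansize parameter is simply the image width when copying whole rows. The pixel coordinates and color value below are arbitrary, chosen only for verification:

```java
import java.awt.image.BufferedImage;

public class PixelRoundTrip {
    public static void main(String[] args) throws Exception {
        BufferedImage original = new BufferedImage(256, 256, BufferedImage.TYPE_INT_RGB);
        original.setRGB(10, 20, 0xFF8800); // sample pixel to verify later

        // Copy pixels, not an encoded stream: getRGB returns packed ARGB ints,
        // and scansize is simply the image width when copying whole rows.
        int w = original.getWidth(), h = original.getHeight();
        int[] pixels = original.getRGB(0, 0, w, h, null, 0, w);

        BufferedImage copy = new BufferedImage(w, h, BufferedImage.TYPE_INT_RGB);
        copy.setRGB(0, 0, w, h, pixels, 0, w);

        System.out.println(copy.getRGB(10, 20) == original.getRGB(10, 20)); // true
    }
}
```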

Compressing a multi-page tiff image with lossy jpeg

I need to compress a tif file that has several gray 16bit images (multi-page). I have tried working with ImageIO as here: Tiff compression using Java ImageIO. Initially, each image that will be in the tif file comes from another tiff file. When I try to use the compressors, I have the following options:
CCITT RLE, CCITT T.4, CCITT T.6: They give me the error: "javax.imageio.IIOException: I/O error writing TIFF file!"
LZW. I cannot use it. My images are 16bit, and LZW increases the size of 16bit images.
JPEG. Not possible for 16bit images.
ZLIB. It only reduces 10% even if I specify setCompressionQuality(0.0f);
PackBits. Does not compress.
Deflate. Like ZLIB.
EXIF JPEG. It gives me the error: "javax.imageio.IIOException: Old JPEG compression not supported!"
Does anyone know any other alternative? I saw an Apache imaging library, but its tif compression only supports the above options or fewer. Does anyone know about a JPEG2000 compressor? Any other kind of alternative?
PNG compresses 16-bit images losslessly. Libraries and utilities are widely available. JPEG2000 has a lossy 16-bit mode, but you'd have to find some software that supports it. Open JPEG might.
However I'd have to ask: what are your criteria for when you have acceptable image quality and when you do not? If it is visual, then you likely end up at normal JPEG anyway, with a good bit less than 8 bits per pixel effective.
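As a quick sanity check of the PNG suggestion: ImageIO's built-in PNG plugin can round-trip TYPE_USHORT_GRAY without loss. A minimal sketch (the image size and the sample value 54321 are arbitrary):

```java
import java.awt.image.BufferedImage;
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import javax.imageio.ImageIO;

public class Png16Bit {
    public static void main(String[] args) throws Exception {
        // 16-bit grayscale image with one known sample value.
        BufferedImage img = new BufferedImage(64, 64, BufferedImage.TYPE_USHORT_GRAY);
        img.getRaster().setSample(3, 4, 0, 54321);

        // PNG is lossless, so the 16-bit samples survive a write/read cycle.
        ByteArrayOutputStream baos = new ByteArrayOutputStream();
        ImageIO.write(img, "png", baos);
        BufferedImage back = ImageIO.read(new ByteArrayInputStream(baos.toByteArray()));

        System.out.println(back.getRaster().getSample(3, 4, 0)); // 54321
    }
}
```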
Reducing the image from 16 bit to 8 bit. Consider that you have a byte[] variable plane16 holding all the pixels of your image.
Note: My byte[] plane16 gets its data from a 16bit image, but a byte is 8bit = 1 byte, so 2 consecutive elements of this array make up 2 bytes = 16 bits. That is why I convert it to a short[] before operating. If you start from a short[], omit the "ByteBuffer.wrap(plane16).order(ByteOrder.LITTLE_ENDIAN).asShortBuffer().get(shorts);" line.
byte[] plane16; // Fill it with your image!!!

// Set up ImageIO: writer and compression method
ImageIO.scanForPlugins();
TIFFImageWriterSpi tiffspi = new TIFFImageWriterSpi();
javax.imageio.ImageWriter writerIO = tiffspi.createWriterInstance();
ImageWriteParam param = writerIO.getDefaultWriteParam();
param.setCompressionMode(ImageWriteParam.MODE_EXPLICIT);
param.setCompressionType("ZLib");
param.setCompressionQuality(0.5f);

File fOutputFile = new File(route + ".tif");
ImageOutputStream ios = ImageIO.createImageOutputStream(fOutputFile);
writerIO.setOutput(ios);

// Reducing 16bit to 8bit
short[] shorts = new short[plane16.length / 2];
ByteBuffer.wrap(plane16).order(ByteOrder.LITTLE_ENDIAN).asShortBuffer().get(shorts);
int max = 0;
int min = 999999;
for (int v = 0; v < shorts.length; v++) {
    if (max < shorts[v]) max = shorts[v];
    if (min > shorts[v]) min = shorts[v];
}
double range = 255. / (max - min);
byte[] plane8 = new byte[shorts.length];
for (int v = 0; v < plane8.length; v++) {
    plane8[v] = (byte) ((shorts[v] - min) * range - 128); // map [min, max] to [-128, 127]
}

// 16bit:
/*BufferedImage convertedGrayscale = new BufferedImage(width, height, BufferedImage.TYPE_USHORT_GRAY);
convertedGrayscale.getRaster().setDataElements(0, 0, width, height, shorts);*/
// 8bit:
BufferedImage convertedGrayscale = new BufferedImage(width, height, BufferedImage.TYPE_BYTE_GRAY);
convertedGrayscale.getRaster().setDataElements(0, 0, width, height, plane8);

// Save the image.
// If you have a stack of images in the tiff, use this trick: "image" is the index of the
// image you are setting inside the tiff. If you only have 1 image, remove the if and keep
// only the expression from the else branch.
if (image != 0) {
    writerIO.writeInsert(image, new IIOImage(convertedGrayscale, null, null), param);
} else {
    writerIO.write(null, new IIOImage(convertedGrayscale, null, null), param);
}
// Do the following only after the last image has been saved
writerIO.dispose();
ios.flush();
ios.close();

Convert OpenCV Mat object to BufferedImage

I am trying to create a helper function using OpenCV Java API that would process an input image and return the output byte array. The input image is a jpg file saved in the computer. The input and output image are displayed in the Java UI using Swing.
System.loadLibrary(Core.NATIVE_LIBRARY_NAME);
// Load image from file
Mat rgba = Highgui.imread(filePath);
Imgproc.cvtColor(rgba, rgba, Imgproc.COLOR_RGB2GRAY, 0);
// Convert back to byte[] and return
byte[] return_buff = new byte[(int) (rgba.total() * rgba.channels())];
rgba.get(0, 0, return_buff);
return return_buff;
When the return_buff is returned and converted to BufferedImage I get NULL back. When I comment out the Imgproc.cvtColor function, the return_buff is properly converted to a BufferedImage that I can display. It seems like the Imgproc.cvtColor is returning a Mat object that I couldn't display in Java.
Here's my code to convert from byte[] to BufferedImage:
InputStream in = new ByteArrayInputStream(inputByteArray);
BufferedImage outputImage = ImageIO.read(in);
In above code, outputImage is NULL
Does anybody have any suggestions or ideas?
ImageIO.read(...) (and the javax.imageio package in general) is for reading/writing images from/to file formats. What you have is an array containing "raw" pixels. It is impossible for ImageIO to determine the file format from this byte array, so it returns null.
Instead, you should create a BufferedImage from the bytes directly. I don't know OpenCV that well, but I'm assuming that the result of Imgproc.cvtColor(rgba, rgba, Imgproc.COLOR_RGB2GRAY, 0) will be an image in grayscale (8 bits/sample, 1 sample/pixel). This is the same format as BufferedImage.TYPE_BYTE_GRAY. If this assumption is correct, you should be able to do:
// Read image to Mat as before
Mat rgba = ...;
Imgproc.cvtColor(rgba, rgba, Imgproc.COLOR_RGB2GRAY, 0);
// Create an empty image in matching format
BufferedImage gray = new BufferedImage(rgba.width(), rgba.height(), BufferedImage.TYPE_BYTE_GRAY);
// Get the BufferedImage's backing array and copy the pixels directly into it
byte[] data = ((DataBufferByte) gray.getRaster().getDataBuffer()).getData();
rgba.get(0, 0, data);
Doing it this way saves you one large byte array allocation and one byte array copy as a bonus. :-)
I used this kind of code to convert a Mat object to a BufferedImage.
static BufferedImage Mat2BufferedImage(Mat matrix) throws Exception {
    MatOfByte mob = new MatOfByte();
    Imgcodecs.imencode(".jpg", matrix, mob);
    byte[] ba = mob.toArray();
    BufferedImage bi = ImageIO.read(new ByteArrayInputStream(ba));
    return bi;
}

Faster way for Bitmap to Byte[] conversion

I am not new to bitmaps nor new to Java. I am trying to convert high resolution bitmaps to byte arrays in a loop. Please find the code here:
ByteArrayOutputStream stream = new ByteArrayOutputStream();
bitmap.compress(Bitmap.CompressFormat.JPEG, 50, stream);
imageByteArray = stream.toByteArray();
When I am using the above approach I am able to convert 5 images in 1 second, but I need it to be even faster. I also tried the ByteBuffer approach, like this:
Bitmap bmp = (Bitmap) intent.getExtras().get("data");
int size = bmp.getRowBytes() * bmp.getHeight();
ByteBuffer b = ByteBuffer.allocate(size);
bmp.copyPixelsToBuffer(b);
byte[] bytes = new byte[size];
try {
    b.get(bytes, 0, bytes.length);
} catch (BufferUnderflowException e) {
    // always happens
}
But this is very slow (slower than the previous approach) :(
Please, can somebody suggest a faster method? Guide me...
The first solution is the right one. But two things can happen here:
The bitmap may not be stored as JPEG, so a conversion occurs, which takes time.
The image is compressed at quality 50, which takes time.
That aside, if it's taking some time, I doubt it could go much faster, it being the right solution.
You must consider the fact that the speed of processing is tightly tied to the speed of the device you are testing on (since this is tagged android, I'm presuming you're using a mobile device).
You should take a look at the Android developer documentation on how to handle large bitmaps effectively: Android developers. Since processing 5 high resolution images per second is slow for you, I presume you have some kind of gallery or previews? If that's the case, you shouldn't handle the full resolution images, and should indeed take a look at the link above.
Also, as a side note, your second snippet can be optimised this way:
int bytes = bmp.getByteCount();
ByteBuffer buffer = ByteBuffer.allocate(bytes);
bmp.copyPixelsToBuffer(buffer);
byte[] array = buffer.array();
Otherwise the most efficient way of copying bytes that I know is copy() taken from Commons-IO:
public static int copy(InputStream input, OutputStream output) throws IOException {
    int n, count = 0;
    byte[] buffer = new byte[4 * 1024];
    while (-1 != (n = input.read(buffer))) {
        output.write(buffer, 0, n);
        count += n;
    }
    return count;
}
You can try the following:
Bitmap bitmap = (Bitmap) intent.getExtras().get("data");
ByteArrayOutputStream stream = new ByteArrayOutputStream();
bitmap.compress(Bitmap.CompressFormat.JPEG, 100, stream);
byte[] byteArray = stream.toByteArray();
Hope it works well for you!
Check the line bitmap.compress(Bitmap.CompressFormat.JPEG, 50, stream); it may cause the problem, as you are compressing to JPEG at quality 50.

Java: BufferedImage to byte array and back

I see that a number of people have had a similar problem, however I'm yet to try find exactly what I'm looking for.
So, I have a method which reads an input image and converts it to a byte array:
File imgPath = new File(ImageName);
BufferedImage bufferedImage = ImageIO.read(imgPath);
WritableRaster raster = bufferedImage.getRaster();
DataBufferByte data = (DataBufferByte) raster.getDataBuffer();
What I now want to do is convert it back into a BufferedImage (I have an application for which I need this functionality). Note that "test" is the byte array.
BufferedImage img = ImageIO.read(new ByteArrayInputStream(test));
File outputfile = new File("src/image.jpg");
ImageIO.write(img,"jpg",outputfile);
However, this returns the following exception:
Exception in thread "main" java.lang.IllegalArgumentException: im == null!
This is because the BufferedImage img is null. I think this has something to do with the fact that in my original conversion from BufferedImage to byte array, information was changed/lost so that the data can no longer be recognised as a JPEG.
Does anyone have any suggestions on how to solve this? Would be greatly appreciated.
This is the recommended way to convert to a byte array:
ByteArrayOutputStream baos = new ByteArrayOutputStream();
ImageIO.write(img, "jpg", baos);
byte[] bytes = baos.toByteArray();
Note that calling close or flush will do nothing; you can see this for yourself by looking at their source/docs:
Closing a ByteArrayOutputStream has no effect.
The flush method of OutputStream does nothing.
Thus use something like this:
ByteArrayOutputStream baos = new ByteArrayOutputStream(THINK_ABOUT_SIZE_HINT);
boolean foundWriter = ImageIO.write(bufferedImage, "jpg", baos);
assert foundWriter; // Not sure about this... with jpg it may work but other formats ?
byte[] bytes = baos.toByteArray();
Here are a few links concerning the size hint:
Java: Memory efficient ByteArrayOutputStream
jpg bits per pixel
Of course always read the source code and docs of the version you are using, do not rely blindly on SO answers.
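For completeness, a self-contained round trip that sidesteps the JPEG issues discussed above by using PNG, which is lossless, so the decoded pixels match the originals exactly (the image size and pixel value below are arbitrary):

```java
import java.awt.image.BufferedImage;
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import javax.imageio.ImageIO;

public class ByteArrayRoundTrip {
    static byte[] toBytes(BufferedImage img) throws Exception {
        ByteArrayOutputStream baos = new ByteArrayOutputStream();
        ImageIO.write(img, "png", baos); // PNG is lossless, so pixels survive
        return baos.toByteArray();
    }

    static BufferedImage fromBytes(byte[] bytes) throws Exception {
        return ImageIO.read(new ByteArrayInputStream(bytes));
    }

    public static void main(String[] args) throws Exception {
        BufferedImage img = new BufferedImage(32, 32, BufferedImage.TYPE_INT_RGB);
        img.setRGB(5, 6, 0x123456); // sample pixel to verify after the round trip
        BufferedImage back = fromBytes(toBytes(img));
        System.out.println(back.getRGB(5, 6) == img.getRGB(5, 6)); // true
    }
}
```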

Categories

Resources