How to read an image with IIOImage and get the Raster - java

I am trying to read an image in Java and access its pixels via the raster. However, I get an NPE because getRaster() returns null. How do I access the raster?
Here is what I am doing:
public static void main(String[] args) throws Exception
{
    IIOImage iioImage = Image.readImage(Main.class.getResourceAsStream("/annalisa-licciardi.png"));
    System.out.println(iioImage.getRaster().getHeight());
}
readImage is implemented as follows:
public static IIOImage readImage(ImageInputStream stream) throws IOException
{
    if (stream == null)
        throw new IllegalArgumentException("stream == null!");
    Iterator<ImageReader> iterator = ImageIO.getImageReaders(stream);
    if (!iterator.hasNext())
        return null;
    ImageReader imageReader = iterator.next();
    ImageReadParam param = imageReader.getDefaultReadParam();
    imageReader.setInput(stream, true, true);
    IIOImage iioImage = imageReader.readAll(0, param);
    stream.close();
    imageReader.dispose();
    return iioImage;
}
public static IIOImage readImage(InputStream inputStream) throws IOException
{
    return readImage(ImageIO.createImageInputStream(inputStream));
}
How do I get the raster?

ImageReader.readAll(...) doesn't work that way.
From the Javadoc:
Reads the image indexed by imageIndex and returns an IIOImage containing the image, thumbnails, and associated image metadata, using a supplied ImageReadParam.
The actual BufferedImage referenced by the returned IIOImage will be chosen using the algorithm defined by the getDestination method.
Also note that an IIOImage can only hold either a BufferedImage or a Raster. Not both. readAll(...) will return an IIOImage that holds a BufferedImage. So, basically, what you are trying to achieve won't work.
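For completeness: when the IIOImage holds a BufferedImage, getRaster() returns null, which is why you get the NPE. You can still obtain a Raster through getRenderedImage(), as in this minimal sketch (the 4x3 in-memory image is a hypothetical stand-in for what readAll(...) would return):

```java
import java.awt.image.BufferedImage;
import java.awt.image.Raster;
import javax.imageio.IIOImage;

public class IIOImageRaster {
    public static void main(String[] args) {
        // Hypothetical stand-in for the image readAll(...) would return
        BufferedImage img = new BufferedImage(4, 3, BufferedImage.TYPE_INT_ARGB);
        IIOImage iioImage = new IIOImage(img, null, null);

        // readAll(...) produces an IIOImage backed by a BufferedImage,
        // so getRaster() returns null -- but the RenderedImage is there:
        Raster raster = iioImage.getRenderedImage().getData();
        System.out.println(raster.getHeight());
    }
}
```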
But as @Marco13 says in the comments, it's trivial to get the Raster from the BufferedImage once you have loaded it:
BufferedImage image = ImageIO.read(input);
WritableRaster raster = image.getRaster();
To get the pixels as int ARGB values, you don't need the Raster, you can always get it from the BufferedImage directly:
int[] pixels = new int[w * h];
image.getRGB(0, 0, w, h, pixels, 0, w);
These values will be normalized to the sRGB color space, and this works regardless of the actual sample format in the image.
However, if (and only if) your raster (or really its backing DataBuffer) already contains pixels in int ARGB (pixel-packed) format, you can access them in this way, which is faster (as it requires no conversion):
int[] pixels = ((DataBufferInt) image.getRaster().getDataBuffer()).getData();
In many cases though, images will be in 3 or 4 byte RGB/BGR or RGBA/ABGR (pixel interleaved) form.
You can then get the pixels directly like this:
byte[] pixels = ((DataBufferByte) image.getRaster().getDataBuffer()).getData();
You need to loop through the values and convert to int packed form, if that is what you prefer.
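As a minimal sketch of such a conversion loop, assuming the image is TYPE_4BYTE_ABGR (so the buffer holds samples in A, B, G, R order; other interleaved layouts need different offsets), with a hypothetical 2x1 test image:

```java
import java.awt.image.BufferedImage;
import java.awt.image.DataBufferByte;

public class InterleavedToPacked {
    public static void main(String[] args) {
        // Hypothetical 2x1 test image in 4-byte ABGR form
        BufferedImage image = new BufferedImage(2, 1, BufferedImage.TYPE_4BYTE_ABGR);
        image.setRGB(0, 0, 0x80FF0000); // semi-transparent red
        image.setRGB(1, 0, 0xFF00FF00); // opaque green

        byte[] bytes = ((DataBufferByte) image.getRaster().getDataBuffer()).getData();

        // TYPE_4BYTE_ABGR stores samples as A, B, G, R per pixel
        int[] packed = new int[bytes.length / 4];
        for (int i = 0; i < packed.length; i++) {
            int a = bytes[i * 4    ] & 0xFF;
            int b = bytes[i * 4 + 1] & 0xFF;
            int g = bytes[i * 4 + 2] & 0xFF;
            int r = bytes[i * 4 + 3] & 0xFF;
            packed[i] = (a << 24) | (r << 16) | (g << 8) | b;
        }
        System.out.printf("%08X %08X%n", packed[0], packed[1]);
    }
}
```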

Related

Java - Convert RAW to JPG/PNG

I have an image captured by a camera, in RAW BGRA format (a byte array).
How can I save it to disk as a JPG/PNG file?
I've tried ImageIO.write from the Java API, but I get an IllegalArgumentException (image == null).
CODE:
try
{
    InputStream input = new ByteArrayInputStream(img);
    BufferedImage bImageFromConvert = ImageIO.read(input);
    String path = "D:/image.jpg";
    ImageIO.write(bImageFromConvert, "jpg", new File(path));
}
catch (Exception ex)
{
    System.out.println(ex.toString());
}
Note that "img" is the RAW byte array, and that is NOT null.
The problem is that ImageIO.read does not support raw RGB (or BGRA in your case) pixels. It expects a file format, like BMP, PNG or JPEG, etc.
In your code above, this causes bImageFromConvert to become null, and this is the reason for the error you see.
If you have a byte array in BGRA format, try this:
// You need to know width/height of the image
int width = ...;
int height = ...;
int samplesPerPixel = 4;
int[] bandOffsets = {2, 1, 0, 3}; // BGRA order
byte[] bgraPixelData = new byte[width * height * samplesPerPixel];
DataBuffer buffer = new DataBufferByte(bgraPixelData, bgraPixelData.length);
WritableRaster raster = Raster.createInterleavedRaster(buffer, width, height, samplesPerPixel * width, samplesPerPixel, bandOffsets, null);
ColorModel colorModel = new ComponentColorModel(ColorSpace.getInstance(ColorSpace.CS_sRGB), true, false, Transparency.TRANSLUCENT, DataBuffer.TYPE_BYTE);
BufferedImage image = new BufferedImage(colorModel, raster, colorModel.isAlphaPremultiplied(), null);
System.out.println("image: " + image); // Should print: image: BufferedImage#<hash>: type = 0 ...
ImageIO.write(image, "PNG", new File(path));
Note that JPEG is not a good format for storing images with an alpha channel. While it is possible, most software will not display it properly, so I suggest using PNG instead.
Alternatively, you could remove the alpha channel, and use JPEG.
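A minimal sketch of that alternative, painting the image onto an opaque TYPE_INT_RGB image before JPEG encoding (the white background and the blank in-memory source image are arbitrary choices for illustration):

```java
import java.awt.Color;
import java.awt.Graphics2D;
import java.awt.image.BufferedImage;
import java.io.ByteArrayOutputStream;
import javax.imageio.ImageIO;

public class DropAlphaForJpeg {
    public static void main(String[] args) throws Exception {
        // Hypothetical source image with an alpha channel
        BufferedImage withAlpha = new BufferedImage(16, 16, BufferedImage.TYPE_INT_ARGB);

        // Paint it onto an opaque RGB image; transparent areas get the background color
        BufferedImage opaque = new BufferedImage(withAlpha.getWidth(), withAlpha.getHeight(),
                BufferedImage.TYPE_INT_RGB);
        Graphics2D g = opaque.createGraphics();
        try {
            g.setColor(Color.WHITE); // background for transparent pixels
            g.fillRect(0, 0, opaque.getWidth(), opaque.getHeight());
            g.drawImage(withAlpha, 0, 0, null);
        } finally {
            g.dispose();
        }

        // JPEG encoding now succeeds, as the image has no alpha channel
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        boolean written = ImageIO.write(opaque, "JPEG", out);
        System.out.println(written + " " + (out.size() > 0));
    }
}
```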
With Matlab you can convert all types of images with 2 lines of code:
img=imread('example.CR2');
imwrite(img,'example.JPG');

Convert OpenCV Mat object to BufferedImage

I am trying to create a helper function using OpenCV Java API that would process an input image and return the output byte array. The input image is a jpg file saved in the computer. The input and output image are displayed in the Java UI using Swing.
System.loadLibrary(Core.NATIVE_LIBRARY_NAME);
// Load image from file
Mat rgba = Highgui.imread(filePath);
Imgproc.cvtColor(rgba, rgba, Imgproc.COLOR_RGB2GRAY, 0);
// Convert back to byte[] and return
byte[] return_buff = new byte[(int) (rgba.total() * rgba.channels())];
rgba.get(0, 0, return_buff);
return return_buff;
When return_buff is returned and converted to a BufferedImage, I get null back. When I comment out the Imgproc.cvtColor call, return_buff is properly converted to a BufferedImage that I can display. It seems like Imgproc.cvtColor returns a Mat object that I can't display in Java.
Here's my code to convert from byte[] to BufferedImage:
InputStream in = new ByteArrayInputStream(inputByteArray);
BufferedImage outputImage = ImageIO.read(in);
In the above code, outputImage is null.
Does anybody have any suggestions or ideas?
ImageIO.read(...) (and the javax.imageio package in general) is for reading/writing images from/to file formats. What you have is an array containing "raw" pixels. It's impossible for ImageIO to determine file format from this byte array. Because of this, it will return null.
Instead, you should create a BufferedImage from the bytes directly. I don't know OpenCV that well, but I'm assuming that the result of Imgproc.cvtColor(rgba, rgba, Imgproc.COLOR_RGB2GRAY, 0) will be an image in grayscale (8 bits/sample, 1 sample/pixel). This is the same format as BufferedImage.TYPE_BYTE_GRAY. If this assumption is correct, you should be able to do:
// Read image to Mat as before
Mat rgba = ...;
Imgproc.cvtColor(rgba, rgba, Imgproc.COLOR_RGB2GRAY, 0);
// Create an empty image in matching format
BufferedImage gray = new BufferedImage(rgba.width(), rgba.height(), BufferedImage.TYPE_BYTE_GRAY);
// Get the BufferedImage's backing array and copy the pixels directly into it
byte[] data = ((DataBufferByte) gray.getRaster().getDataBuffer()).getData();
rgba.get(0, 0, data);
Doing it this way saves you one large byte array allocation and one byte array copy as a bonus. :-)
I used this kind of code to convert a Mat object to a BufferedImage:
static BufferedImage Mat2BufferedImage(Mat matrix) throws Exception {
    MatOfByte mob = new MatOfByte();
    Imgcodecs.imencode(".jpg", matrix, mob);
    byte[] ba = mob.toArray();
    BufferedImage bi = ImageIO.read(new ByteArrayInputStream(ba));
    return bi;
}

How to convert a BufferedImage to indexed type and then extract the ARGB color palette

I need to convert a BufferedImage to an indexed-type BufferedImage to extract the color index data and the 256-color palette.
I think I am doing the conversion to indexed mode and the extraction of the color indices correctly with the following code:
BufferedImage paletteBufferedImage=new BufferedImage(textureInfoSubFile.getWidth(), textureInfoSubFile.getHeight(),BufferedImage.TYPE_BYTE_INDEXED);
paletteBufferedImage.getGraphics().drawImage(originalBufferedImage, 0, 0, null);
// puts the image pixeldata into the ByteBuffer
byte[] pixels = ((DataBufferByte) paletteBufferedImage.getRaster().getDataBuffer()).getData();
My problem now is that I need to know the ARGB values of each color index (the palette) to put them into an array. I have been reading about ColorModel and ColorSpace, but I can't find methods that do what I need.
I think your code is good (except that you don't "put" any data into anything; you merely reference the data buffer's backing array, meaning changes in pixels will reflect in paletteBufferedImage and vice versa).
To get the ARGB values for the indices in pixels:
IndexColorModel indexedCM = (IndexColorModel) paletteBufferedImage.getColorModel(); // cast is safe for TYPE_BYTE_INDEXED
int[] palette = new int[indexedCM.getMapSize()]; // Allocate array
indexedCM.getRGBs(palette); // Copy palette to array (ARGB values)
For more information, see the IndexColorModel class documentation.
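Putting it together, each byte in the raster is an index into that palette array; a minimal, self-contained sketch (the 2x2 red test image is a hypothetical stand-in for your indexed image):

```java
import java.awt.Color;
import java.awt.Graphics2D;
import java.awt.image.BufferedImage;
import java.awt.image.DataBufferByte;
import java.awt.image.IndexColorModel;

public class PaletteLookup {
    public static void main(String[] args) {
        // Hypothetical indexed image standing in for paletteBufferedImage
        BufferedImage indexed = new BufferedImage(2, 2, BufferedImage.TYPE_BYTE_INDEXED);
        Graphics2D g = indexed.createGraphics();
        g.setColor(Color.RED);
        g.fillRect(0, 0, 2, 2);
        g.dispose();

        IndexColorModel icm = (IndexColorModel) indexed.getColorModel();
        int[] palette = new int[icm.getMapSize()];
        icm.getRGBs(palette); // ARGB entries

        // Each byte in the raster is an index into the palette
        byte[] pixels = ((DataBufferByte) indexed.getRaster().getDataBuffer()).getData();
        int argb = palette[pixels[0] & 0xFF]; // & 0xFF because Java bytes are signed
        System.out.printf("%08X%n", argb);
    }
}
```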
Finally I solved it with this code:
public static BufferedImage rgbaToIndexedBufferedImage(BufferedImage sourceBufferedImage) {
    // With this constructor, we create an indexed buffered image with the same
    // dimensions and a default 256-color model
    BufferedImage indexedImage = new BufferedImage(sourceBufferedImage.getWidth(), sourceBufferedImage.getHeight(), BufferedImage.TYPE_BYTE_INDEXED);
    ColorModel cm = indexedImage.getColorModel();
    IndexColorModel icm = (IndexColorModel) cm;
    int size = icm.getMapSize();
    byte[] reds = new byte[size];
    byte[] greens = new byte[size];
    byte[] blues = new byte[size];
    icm.getReds(reds);
    icm.getGreens(greens);
    icm.getBlues(blues);
    WritableRaster raster = indexedImage.getRaster();
    int pixel = raster.getSample(0, 0, 0);
    IndexColorModel icm2 = new IndexColorModel(8, size, reds, greens, blues, pixel);
    indexedImage = new BufferedImage(icm2, raster, sourceBufferedImage.isAlphaPremultiplied(), null);
    indexedImage.getGraphics().drawImage(sourceBufferedImage, 0, 0, null);
    return indexedImage;
}

ByteArrayOutput/InputStream on transparent image

I'm trying to send a BufferedImage over a socket. I do this by converting the image to byte[] and then sending it over after encoding it in Base64. I'm sending two BufferedImages; one of them is "full", the other is about 50% transparent. The problem I'm having is that when they arrive, the second image is still visually transparent, but the data array I get via the Raster has been changed.
I made some small test code to demonstrate the problem:
BufferedImage levelBufferedOriginal = ...
BufferedImage backgroundBufferedOriginal = ...
byte[] levelDataOriginal = ((DataBufferByte) levelBufferedOriginal.getRaster().getDataBuffer()).getData();
byte[] backgroundDataOriginal = ((DataBufferByte) backgroundBufferedOriginal.getRaster().getDataBuffer()).getData();
ByteArrayOutputStream baos = new ByteArrayOutputStream();
byte[] temp = null, temp2 = null;
try {
    ImageIO.write(levelBufferedOriginal, "png", baos);
    baos.flush();
    temp = baos.toByteArray();
    baos.close();
    baos = new ByteArrayOutputStream();
    ImageIO.write(backgroundBufferedOriginal, "png", baos);
    baos.flush();
    temp2 = baos.toByteArray();
    baos.close();
} catch (IOException e1) {
    e1.printStackTrace();
}
BufferedImage levelBufferedNew = null;
BufferedImage backgroundBufferedNew = null;
try {
    levelBufferedNew = ImageIO.read(new ByteArrayInputStream(temp));
    backgroundBufferedNew = ImageIO.read(new ByteArrayInputStream(temp2));
} catch (IOException e) {
    e.printStackTrace();
}
byte[] levelDataNew = ((DataBufferByte) levelBufferedNew.getRaster().getDataBuffer()).getData();
byte[] backgroundDataNew = ((DataBufferByte) backgroundBufferedNew.getRaster().getDataBuffer()).getData();
System.out.println("LEVEL: " + Arrays.equals(levelDataOriginal, levelDataNew));
System.out.println("BACKGROUND: " + Arrays.equals(backgroundDataOriginal, backgroundDataNew));
All I do here is transform the BufferedImages to byte[] and back, then compare the data I get from their DataBufferBytes. The output is:
LEVEL: false
BACKGROUND: true
Background is the "full" image, and Level is the one with some transparent pixels.
If the general idea is wrong, I would like to hear another one; all I want is to be able to exactly recreate the two BufferedImages.
edit: What we have established so far:
The images (both before and after) are TYPE_BYTE_INDEXED (13) with IndexColorModel (color map)
The before image has a transparent color in the color map, at index 255 (which is the value -1 in the byte array, as Java uses signed bytes). The after image has a different value at this index, that is not transparent.
The images are serialized/deserialized in PNG format, using ImageIO
The images are visually equal, but the raw pixel data (the byte array) differs
Which leads to the conclusion that the ImageIO PNGImageWriter re-arranges the entries in the color map when writing, resulting in different pixel data/color map.
This basically leaves us with two options:
Serialize the image data in a different way, to assure the color map/pixel data is not modified. It is possible to send the pixel data array, along with the color map array and the height/width of the image, and then re-create the image exactly at the client. This is quite a bit of code, and is probably covered by other questions on SO.
Don't rely on the pixel data/color maps being the same. Use the value of ((IndexColorModel) levelBufferedNew.getColorModel()).getTransparentPixel() to test for/set transparency instead of the hardcoded value -1. This requires pretty much no other change in your code.
Note: These solutions will only work for TYPE_BYTE_INDEXED (13) images.
For a more generic (but possibly slower) approach, use the code in the original answer to set transparent parts, and use (levelBufferedNew.getRGB(x, y) >> 24) == 0 to test for transparency. This should work even for TYPE_INT_ARGB or TYPE_4BYTE_ABGR.
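That generic test can be wrapped in a small helper; a minimal sketch (the image and coordinates are made up for illustration):

```java
import java.awt.image.BufferedImage;

public class TransparencyCheck {
    // True if the pixel at (x, y) is fully transparent, regardless of image type
    static boolean isTransparent(BufferedImage image, int x, int y) {
        return (image.getRGB(x, y) >>> 24) == 0; // unsigned shift keeps only the alpha bits
    }

    public static void main(String[] args) {
        BufferedImage img = new BufferedImage(2, 1, BufferedImage.TYPE_INT_ARGB);
        img.setRGB(1, 0, 0xFF000000); // opaque black; (0, 0) stays fully transparent
        System.out.println(isTransparent(img, 0, 0) + " " + isTransparent(img, 1, 0));
    }
}
```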
original answer:
Instead of fiddling with the image at byte array level, why not try using normal Java2D? ;-)
Something like:
Graphics2D g = levelBufferedNew.createGraphics();
try {
    g.setComposite(AlphaComposite.Clear);
    g.fillOval(x, y, w, h); // The area you want to make transparent
}
finally {
    g.dispose();
}
...should work.
PS: As the images use IndexColorModel, you can use getTransparentPixel() to get the transparent pixel index, instead of relying on it being at a certain index (-1/255). Then you can still manipulate at the byte array level. ;-)

How can I write 16 bit grayscale image as jpeg?

I have a 16-bit-per-pixel grayscale BufferedImage created from an array of shorts:
private BufferedImage get16bitImage(short[] pixels) {
    ColorModel colorModel = new ComponentColorModel(
            ColorSpace.getInstance(ColorSpace.CS_GRAY),
            new int[] {16},
            false,
            false,
            Transparency.OPAQUE,
            DataBuffer.TYPE_USHORT);
    DataBufferUShort db = new DataBufferUShort(pixels, pixels.length);
    WritableRaster raster = Raster.createInterleavedRaster(
            db,
            imgD.width,
            imgD.height,
            imgD.width,
            1,
            new int[1],
            null);
    return new BufferedImage(colorModel, raster, false, null);
}
When trying to save it:
ImageIO.write(img, "PNG", new File(resultImgNamePNG)); // works fine
ImageIO.write(img, "BMP", new File(resultImgNameBMP)); // doesn't work, returns false
ImageIO.write(img, "JPEG", new File(resultImgNameJPEG)); // doesn't work, returns false
I tried using JAI:
public void writeImageToJPEG(File out, BufferedImage image, float quality) throws IOException {
    JPEGEncodeParam param = new JPEGEncodeParam();
    param.setQuality(quality);
    ImageEncoder encoder = ImageCodec.createImageEncoder("JPEG", new FileOutputStream(out), param);
    encoder.encode(image);
}
encoder.encode(image) throws java.lang.RuntimeException: Only 1, or 3-band byte data may be written.
I think you have to convert it to 8-bit first. If this is used for display purposes, Java converts it to 8-bit before display anyway.
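A minimal sketch of such a conversion, linearly scaling each unsigned 16-bit sample down to its high byte before encoding as JPEG (the dimensions and ramp data are made up for illustration):

```java
import java.awt.image.BufferedImage;
import java.awt.image.DataBufferByte;
import java.io.ByteArrayOutputStream;
import javax.imageio.ImageIO;

public class SixteenToEightBit {
    public static void main(String[] args) throws Exception {
        int width = 4, height = 4;
        // Hypothetical 16-bit grayscale samples, as in the question
        short[] pixels16 = new short[width * height];
        for (int i = 0; i < pixels16.length; i++) {
            pixels16[i] = (short) (i * 4096); // ramp over the 16-bit range
        }

        // Linear scale: keep the high byte of each unsigned 16-bit sample
        BufferedImage gray8 = new BufferedImage(width, height, BufferedImage.TYPE_BYTE_GRAY);
        byte[] pixels8 = ((DataBufferByte) gray8.getRaster().getDataBuffer()).getData();
        for (int i = 0; i < pixels16.length; i++) {
            pixels8[i] = (byte) ((pixels16[i] & 0xFFFF) >> 8);
        }

        // TYPE_BYTE_GRAY is accepted by the standard JPEG writer
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        boolean written = ImageIO.write(gray8, "JPEG", out);
        System.out.println(written);
    }
}
```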
You can also do something I've seen actually improve the displayed image, which is non-linear scaling of the values (using a log scale, for example). Whether that helps depends on the image you are generating, of course.
More on such effects here: http://www.java.net/external?url=http://www.cs.unm.edu/~brayer/vision/perception.html
