How do I convert a javafx.scene.image.Image to a byte array in the format bgra?
I tried doing:
PixelReader pixelReader = img.getPixelReader();
int width = (int) img.getWidth();
int height = (int) img.getHeight();
byte[] buffer = new byte[width * height * 4];
pixelReader.getPixels(
        0, 0, width, height,
        PixelFormat.getByteBgraInstance(),
        buffer,
        0,      // offset
        width); // scanlineStride
but it didn't work; my byte[] buffer is still filled with zeros.
The scanlineStride is measured in buffer elements (bytes for a byte BGRA format), not in pixels, so the width must be multiplied by 4:
PixelReader pixelReader = img.getPixelReader();
int width = (int) img.getWidth();
int height = (int) img.getHeight();
byte[] buffer = new byte[width * height * 4];
pixelReader.getPixels(
        0, 0, width, height,
        PixelFormat.getByteBgraInstance(),
        buffer,
        0,          // offset
        width * 4); // scanlineStride in bytes
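To see why the factor of 4 matters, the index arithmetic for a BGRA byte buffer can be sketched in plain Java (a self-contained illustration, not JavaFX API code):

```java
public class BgraIndexDemo {
    // In a byte BGRA buffer each pixel occupies 4 bytes, so a full row
    // occupies scanlineStride = width * 4 bytes.
    static int byteIndex(int x, int y, int width, int channel) {
        int scanlineStride = width * 4;               // bytes per row, not pixels
        return y * scanlineStride + x * 4 + channel;  // channel: 0=B, 1=G, 2=R, 3=A
    }

    public static void main(String[] args) {
        int width = 10;
        // Pixel (0,1) starts one full row, i.e. 40 bytes, into the buffer.
        System.out.println(byteIndex(0, 1, width, 0)); // 40
        // Alpha of pixel (2,3): 3 * 40 + 2 * 4 + 3
        System.out.println(byteIndex(2, 3, width, 3)); // 131
    }
}
```

Passing the pixel width as the stride therefore mis-describes the row layout to getPixels.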
I need a fast way to convert a JavaFX Image to a byte array.
The route via BufferedImage bImage = SwingFXUtils.fromFXImage(i, null); is too slow.
I think it's better not to convert the Image to an awt.BufferedImage first.
So what I have so far is:
PixelReader pr = img.getPixelReader();
WritablePixelFormat<ByteBuffer> pixelformat = WritablePixelFormat.getByteBgraInstance();
int w = (int) img.getWidth();
int h = (int) img.getHeight();
int offset = 0;
int scanlineStride = w * 4;
byte[] buffer = new byte[w * h * 4];
pr.getPixels(0, 0, w, h, pixelformat, buffer, offset, scanlineStride);
But this is not working as expected.
It seems like the byte[] stays empty?
You can use the toByteArray() method of the IOUtils class from the org.apache.commons.io package.
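For context, IOUtils.toByteArray reads an entire InputStream into a byte[]; note that it yields the encoded file bytes, not decoded BGRA pixels. If you'd rather avoid the Commons IO dependency, a plain-JDK sketch of the same idea:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;

public class StreamToBytes {
    // Plain-JDK equivalent of IOUtils.toByteArray(InputStream).
    static byte[] toByteArray(InputStream in) throws IOException {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        byte[] chunk = new byte[8192];
        int n;
        while ((n = in.read(chunk)) != -1) {
            out.write(chunk, 0, n); // append only the bytes actually read
        }
        return out.toByteArray();
    }

    public static void main(String[] args) throws IOException {
        byte[] data = {1, 2, 3, 4};
        byte[] copy = toByteArray(new ByteArrayInputStream(data));
        System.out.println(copy.length); // 4
    }
}
```

Either way, to get at pixel data you still need to decode those bytes afterwards (e.g. with ImageIO).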
I use LWJGL to render images in OpenGL, and now I want to store the content of the framebuffer as RGB in an OpenCV matrix. To make sure everything runs fine, I'm showing the captured image on a panel of a JFrame.
But here's the problem: stored JPEGs display fine, but when I try to show the captured framebuffer I only see stripes!
Here is the code for a screenshot:
public Mat takeMatScreenshot()
{
    int width = m_iResolutionX;
    int height = m_iResolutionY;
    int pixelCount = width * height;
    byte[] pixelValues = new byte[pixelCount * 3];
    ByteBuffer pixelBuffer = BufferUtils.createByteBuffer(width * height * 3);

    glBindFramebuffer(GL_FRAMEBUFFER, m_iFramebuffer);
    glReadPixels(0, 0, width, height, GL_RGB, GL_UNSIGNED_BYTE, pixelBuffer);

    for (int i = 0; i < pixelCount; i++)
    {
        int line = height - 1 - (i / width); // flipping the image upside down
        int column = i % width;
        int bufferIndex = (line * width + column) * 3;
        pixelValues[bufferIndex + 0] = (byte) (pixelBuffer.get(bufferIndex + 0) & 0xFF);
        pixelValues[bufferIndex + 1] = (byte) (pixelBuffer.get(bufferIndex + 1) & 0xFF);
        pixelValues[bufferIndex + 2] = (byte) (pixelBuffer.get(bufferIndex + 2) & 0xFF);
    }

    Mat image = new Mat(width, height, CvType.CV_8UC3);
    image.put(0, 0, pixelValues);
    new ImageFrame(image);
    return image;
}
And here the code for displaying a Mat:
public static Image toBufferedImage(Mat m)
{
    int type = BufferedImage.TYPE_BYTE_GRAY;
    if (m.channels() == 3)
        type = BufferedImage.TYPE_3BYTE_BGR;
    if (m.channels() == 4)
        type = BufferedImage.TYPE_4BYTE_ABGR;
    int bufferSize = m.channels() * m.cols() * m.rows();
    byte[] b = new byte[bufferSize];
    m.get(0, 0, b); // get all the pixels
    BufferedImage image = new BufferedImage(m.cols(), m.rows(), type);
    final byte[] targetPixels = ((DataBufferByte) image.getRaster().getDataBuffer()).getData();
    System.arraycopy(b, 0, targetPixels, 0, b.length);
    return image;
}
It would be great if anyone could help me!
Cheers!
Oh no! Facepalm!
The constructor of an OpenCV Mat object is Mat(rows, cols)!
So the right solution is:
Mat image = new Mat(height, width, CvType.CV_8UC3);
image.put(0, 0, pixelValues);
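One more thing worth checking in the screenshot loop: it reads and writes the same bufferIndex, so the intended vertical flip is actually a no-op. A flip over tightly packed 3-byte rows can be sketched like this (plain Java, method names illustrative):

```java
public class FlipRows {
    // Vertically flips an image stored as tightly packed 3-byte pixels,
    // row by row, e.g. to convert OpenGL's bottom-up rows to top-down.
    static byte[] flipVertically(byte[] src, int width, int height) {
        int rowBytes = width * 3;
        byte[] dst = new byte[src.length];
        for (int row = 0; row < height; row++) {
            int srcPos = row * rowBytes;
            int dstPos = (height - 1 - row) * rowBytes; // mirrored row
            System.arraycopy(src, srcPos, dst, dstPos, rowBytes);
        }
        return dst;
    }

    public static void main(String[] args) {
        // A 1x2 image: top pixel (1,2,3), bottom pixel (4,5,6)
        byte[] img = {1, 2, 3, 4, 5, 6};
        byte[] flipped = flipVertically(img, 1, 2);
        System.out.println(flipped[0]); // 4
    }
}
```

Copying whole rows with System.arraycopy is also faster than the per-component loop.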
I'm trying to binarize an RGB image using the adaptiveThreshold method. My code is as follow:
public byte[] filter(byte[] buff, int width, int height) {
    Mat img_rgb = new Mat(width, height, CvType.CV_8UC3);
    Mat img_gray = new Mat(width, height, CvType.CV_8U);
    Mat img_bin = new Mat(width, height, CvType.CV_8U);
    img_rgb.put(0, 0, buff);
    Imgproc.cvtColor(img_rgb, img_gray, Imgproc.COLOR_RGB2GRAY);
    Imgproc.adaptiveThreshold(img_gray, img_bin, 255,
            Imgproc.ADAPTIVE_THRESH_GAUSSIAN_C, Imgproc.THRESH_BINARY, 5, 2);
    int size = (int) img_bin.total() * img_bin.channels();
    byte[] bin_buff = new byte[size];
    img_bin.get(0, 0, bin_buff);
    return bin_buff;
}
The maximum value of the img_bin data after applying adaptiveThreshold should be 255, but it is -1 instead. Why is this happening? I'm new to OpenCV and I can't find any explanation.
Thanks in advance.
The two's complement representation of -1 in a signed byte is bit-for-bit equal to that of 255 in an unsigned byte. Java bytes are always signed, so you are probably missing an & 0xFF mask (or a cast) somewhere in your code.
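A minimal demonstration of that signed/unsigned round trip:

```java
public class SignedByteDemo {
    public static void main(String[] args) {
        byte b = (byte) 255;          // narrowing keeps the low 8 bits: 0xFF
        System.out.println(b);        // -1, because Java bytes are signed
        int unsigned = b & 0xFF;      // mask to recover the unsigned value
        System.out.println(unsigned); // 255
    }
}
```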
What in Java corresponds to the members of C#'s PixelFormat?
E.g. does Format24bppRgb correspond to BufferedImage.TYPE_INT_RGB?
Here is my code. I have an image whose .NET PixelFormat is Format32bppArgb. I am creating a BufferedImage like this:
int sizeBytes = width * height;
DataBufferByte dataBuffer = new DataBufferByte(myImageBytes, sizeBytes);
WritableRaster raster = Raster.createInterleavedRaster(
        dataBuffer,
        width,
        height,
        width * 4,             // scanlineStride
        4,                     // pixelStride
        new int[]{0, 1, 2, 3}, // bandOffsets
        null);                 // location
java.awt.image.ColorModel colorModel = new ComponentColorModel(
        ColorSpace.getInstance(ColorSpace.CS_sRGB),
        new int[]{8, 8, 8, 8}, // bits
        true,                  // hasAlpha
        false,                 // isPreMultiplied
        ComponentColorModel.TRANSLUCENT, DataBuffer.TYPE_BYTE);
BufferedImage result = new BufferedImage(colorModel, raster, false, null);
After I create the BufferedImage, the red and blue colors are swapped in it.
Next, I tried to create the image as follows:
BufferedImage result = new BufferedImage(width, height, BufferedImage.TYPE_4BYTE_ABGR);
WritableRaster r = result.getRaster();
int[] pixels = byteToInt(bytes);
r.setPixels(0, 0, width, height, pixels); // ! An exception occurs here, because after converting the byte array to ints the width becomes too long.
The byte array was converted by this method:
private int[] byteToInt(byte[] pixels) {
    int[] ints = new int[pixels.length / 3];
    int byteIdx = 0;
    for (int pixel = 0; pixel < ints.length; pixel++) {
        int red = pixels[byteIdx++] & 0xFF;
        int green = pixels[byteIdx++] & 0xFF;
        int blue = pixels[byteIdx++] & 0xFF;
        ints[pixel] = (red << 16) | (green << 8) | blue;
    }
    return ints;
}
The colors look fine now, but I get this exception:
java.lang.ArrayIndexOutOfBoundsException: 27600
at sun.awt.image.ByteInterleavedRaster.setPixels(ByteInterleavedRaster.java:1106)
If I use a smaller width (e.g. width / 3), the colors look fine but the picture itself shrinks.
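A likely cause, stated here as an assumption: Format32bppArgb stores four bytes per pixel (B, G, R, A in memory order), but byteToInt consumes only three, so the resulting int[] describes far more pixels than width * height. A four-bytes-per-pixel converter might look like this (class and method names are illustrative):

```java
public class ArgbBytesToInt {
    // Converts 4-byte B,G,R,A pixels into packed ARGB ints
    // (the layout BufferedImage.setRGB expects).
    static int[] bgraToIntArgb(byte[] pixels) {
        int[] ints = new int[pixels.length / 4];
        for (int p = 0, i = 0; p < ints.length; p++) {
            int blue  = pixels[i++] & 0xFF;
            int green = pixels[i++] & 0xFF;
            int red   = pixels[i++] & 0xFF;
            int alpha = pixels[i++] & 0xFF;
            ints[p] = (alpha << 24) | (red << 16) | (green << 8) | blue;
        }
        return ints;
    }

    public static void main(String[] args) {
        // One opaque red pixel: B=0, G=0, R=255, A=255
        byte[] px = {0, 0, (byte) 255, (byte) 255};
        System.out.printf("%08X%n", bgraToIntArgb(px)[0]); // FFFF0000
    }
}
```

The packed ints can then be written with img.setRGB(0, 0, width, height, ints, 0, width), keeping the original width.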
BufferedImage is definitely a good place to start. Many of the values in PixelFormat match up with values in BufferedImage: each has 24-bit and 32-bit RGB/ARGB values, both have 5-5-5 and 5-6-5 combinations, etc.
If you're having trouble, post some code and we'll have a look and try to help. I would recommend playing around with the byte ordering (in the pixel ints) until you get the result you expect; try drawing the BufferedImage onto a GUI component like a JPanel so you can see what it looks like.
If you've got an array of int[] for your pixel values, this is the code I usually use for displaying the array as an image...
int[] pixels;
ColorModel model = new DirectColorModel(32, 0x00ff0000, 0x0000ff00, 0x000000ff, 0xff000000);
Image image = new JLabel().createImage(new MemoryImageSource(width, height, model, pixels, 0, width));
What is the right way to convert a raw array of bytes into an Image in Java SE?
The array consists of bytes where each three bytes represent one pixel, one byte per RGB component.
Can anybody suggest a code sample?
Thanks,
Mike
You can do it using the Raster class. It's better because it does not require iterating over and copying byte arrays.
byte[] raw = new byte[width * height * 3]; // raw bytes of our image
DataBuffer buffer = new DataBufferByte(raw, raw.length);
// The most difficult part of the AWT API for me to learn
SampleModel sampleModel = new ComponentSampleModel(
        DataBuffer.TYPE_BYTE, width, height, 3, width * 3, new int[]{2, 1, 0});
Raster raster = Raster.createRaster(sampleModel, buffer, null);
BufferedImage image = new BufferedImage(width, height, BufferedImage.TYPE_3BYTE_BGR);
image.setData(raster);
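A quick self-contained check of this approach (sizes and values are just for illustration): build a 1x1 image from a single B,G,R triple and read it back with getRGB:

```java
import java.awt.image.BufferedImage;
import java.awt.image.ComponentSampleModel;
import java.awt.image.DataBuffer;
import java.awt.image.DataBufferByte;
import java.awt.image.Raster;
import java.awt.image.SampleModel;

public class BgrRasterDemo {
    static BufferedImage fromBgr(byte[] raw, int width, int height) {
        DataBuffer buffer = new DataBufferByte(raw, raw.length);
        // Band offsets {2,1,0}: byte 0 holds blue, byte 1 green, byte 2 red.
        SampleModel sampleModel = new ComponentSampleModel(
                DataBuffer.TYPE_BYTE, width, height, 3, width * 3, new int[]{2, 1, 0});
        Raster raster = Raster.createRaster(sampleModel, buffer, null);
        BufferedImage image = new BufferedImage(width, height, BufferedImage.TYPE_3BYTE_BGR);
        image.setData(raster);
        return image;
    }

    public static void main(String[] args) {
        // One pixel: B=0x10, G=0x20, R=0x30
        byte[] raw = {0x10, 0x20, 0x30};
        BufferedImage img = fromBgr(raw, 1, 1);
        System.out.printf("%06X%n", img.getRGB(0, 0) & 0xFFFFFF); // 302010
    }
}
```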
Assuming you know the height and width of the image.
BufferedImage img = new BufferedImage(width, height, BufferedImage.TYPE_INT_RGB);
for (int r = 0; r < height; r++)
    for (int c = 0; c < width; c++)
    {
        int index = (r * width + c) * 3; // three bytes per pixel
        int red = colors[index] & 0xFF;
        int green = colors[index + 1] & 0xFF;
        int blue = colors[index + 2] & 0xFF;
        int rgb = (red << 16) | (green << 8) | blue;
        img.setRGB(c, r, rgb);
    }
Roughly. This assumes the pixel data is laid out row by row, and that the length of colors is 3 * width * height.
folkyatina's approach works if your RGB values are in B,G,R order, but if they are in R,G,B order I have found the following code to work:
DataBuffer rgbData = new DataBufferByte(rgbs, rgbs.length);
WritableRaster raster = Raster.createInterleavedRaster(
        rgbData, width, height,
        width * 3,          // scanlineStride
        3,                  // pixelStride
        new int[]{0, 1, 2}, // bandOffsets
        null);
ColorModel colorModel = new ComponentColorModel(
        ColorSpace.getInstance(ColorSpace.CS_sRGB),
        new int[]{8, 8, 8}, // bits
        false,              // hasAlpha
        false,              // isPreMultiplied
        ComponentColorModel.OPAQUE,
        DataBuffer.TYPE_BYTE);
return new BufferedImage(colorModel, raster, false, null);
There is a setRGB variant which accepts an int array of packed ARGB values:
BufferedImage img = new BufferedImage(width, height, BufferedImage.TYPE_INT_RGB);
int[] raw = new int[data.length / 3]; // one packed int per 3-byte pixel
for (int i = 0; i < raw.length; i++) {
    raw[i] = 0xFF000000 |
            ((data[3 * i] & 0xFF) << 16) |
            ((data[3 * i + 1] & 0xFF) << 8) |
            ((data[3 * i + 2] & 0xFF));
}
img.setRGB(0, 0, width, height, raw, 0, width);
The performance characteristics are similar to CoderTao's solution.
This assumes your byte array contains an encoded image (e.g. PNG or JPEG) rather than raw pixel data:
byte[] imageBytes = new byte[1024];
// decode to a BufferedImage (ImageIO.read returns null if the bytes are not a known image format)
BufferedImage bufferedImage = ImageIO.read(new ByteArrayInputStream(imageBytes));
// if you want to do some operations on the image, like resize,
// use the net.coobird.thumbnailator library
BufferedImage image = Thumbnails.of(bufferedImage).forceSize(WIDTH, HEIGHT)
        .outputFormat("bmp").asBufferedImage();