Converting a ByteBuffer to an Image gives noisy output - java

I am trying to convert a ByteBuffer to a Bitmap image, but the output I get is noisy, i.e. not what I expected. My code is as follows:
private Bitmap getOutputImage(ByteBuffer output) {
    output.rewind();
    int outputWidth = 384;
    int outputHeight = 384;
    Bitmap bitmap = Bitmap.createBitmap(outputWidth, outputHeight, Bitmap.Config.RGB_565);
    int[] pixels = new int[outputWidth * outputHeight];
    for (int i = 0; i < outputWidth * outputHeight; i++) {
        float r = ((float) output.get()) * 255.0f;
        float g = ((float) output.get()) * 255.0f;
        float b = ((float) output.get()) * 255.0f;
        pixels[i] = (((int) r) << 16) | (((int) g) << 8) | ((int) b);
    }
    bitmap.setPixels(pixels, 0, outputWidth, 0, 0, outputWidth, outputHeight);
    return bitmap;
}
The output image I am getting is noisy. Please advise me: what is wrong here?

output.get() reads 1 byte from the buffer. You have to change output.get() to output.getFloat(); then it will work.
This is my code:
ByteBuffer modelOutput = ByteBuffer.allocateDirect(200 * 200 * 3 * 4).order(ByteOrder.nativeOrder());
Interpreter tflite = getTfliteInterpreter("ESRGAN.tflite");
tflite.run(input, modelOutput);
modelOutput.rewind();

int outputWidth = 200;
int outputHeight = 200;
Bitmap bitmap2 = Bitmap.createBitmap(outputWidth, outputHeight, Bitmap.Config.ARGB_8888);
int[] pixels = new int[outputWidth * outputHeight];
for (int i = 0; i < outputWidth * outputHeight; i++) {
    int a = 0xFF;
    float r = modelOutput.getFloat();
    float g = modelOutput.getFloat();
    float b = modelOutput.getFloat();
    pixels[i] = a << 24 | ((int) r << 16) | ((int) g << 8) | (int) b;
}
bitmap2.setPixels(pixels, 0, outputWidth, 0, 0, outputWidth, outputHeight);
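The difference is easy to see outside Android: each float occupies 4 bytes in the buffer, and get() returns only the first of those bytes as a signed byte, which scrambles every pixel after it. A minimal plain-Java sketch (class and method names are mine, not from the question):

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public class BufferDemo {
    // Reads the first byte of a float's 4-byte encoding (what output.get() sees).
    static byte readAsByte(float value) {
        ByteBuffer buf = ByteBuffer.allocate(4).order(ByteOrder.BIG_ENDIAN);
        buf.putFloat(value).rewind();
        return buf.get(); // first byte of the IEEE-754 representation
    }

    // Reads all 4 bytes and decodes them back to the float (what getFloat() sees).
    static float readAsFloat(float value) {
        ByteBuffer buf = ByteBuffer.allocate(4).order(ByteOrder.BIG_ENDIAN);
        buf.putFloat(value).rewind();
        return buf.getFloat();
    }

    public static void main(String[] args) {
        // 0.5f is stored as 0x3F000000; get() sees only 0x3F (= 63)
        System.out.println(readAsByte(0.5f));  // 63
        System.out.println(readAsFloat(0.5f)); // 0.5
    }
}
```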

Related

How to recolorize an image in JavaFX

I have the following constructor for a RecoloredImage that takes an old image, and replaces every old colored pixel with a new colored pixel. However, the image doesn't actually change. The code between the comments is purely for testing purposes, and the resulting printed line is not at all the new color I want.
public RecoloredImage(Image inputImage, Color oldColor, Color newColor) {
    int width = (int) inputImage.getWidth();
    int height = (int) inputImage.getHeight();
    WritableImage outputImage = new WritableImage(width, height);
    PixelReader reader = inputImage.getPixelReader();
    PixelWriter writer = outputImage.getPixelWriter();
    // -- testing --
    PixelReader newReader = outputImage.getPixelReader();
    // -- end testing --
    int ob = (int) oldColor.getBlue() * 255;
    int or = (int) oldColor.getRed() * 255;
    int og = (int) oldColor.getGreen() * 255;
    int nb = (int) newColor.getBlue() * 255;
    int nr = (int) newColor.getRed() * 255;
    int ng = (int) newColor.getGreen() * 255;
    for (int y = 0; y < height; y++) {
        for (int x = 0; x < width; x++) {
            int argb = reader.getArgb(x, y);
            int a = (argb >> 24) & 0xFF;
            int r = (argb >> 16) & 0xFF;
            int g = (argb >> 8) & 0xFF;
            int b = argb & 0xFF;
            if (g == og && r == or && b == ob) {
                r = nr;
                g = ng;
                b = nb;
            }
            argb = (a << 24) | (r << 16) | (g << 8) | b;
            writer.setArgb(x, y, argb);
            // -- testing --
            String s = Integer.toHexString(newReader.getArgb(x, y));
            if (!s.equals("0"))
                System.out.println(s);
            // -- end testing --
        }
    }
    image = outputImage;
}
The cast operator has a higher precedence than the multiplication operator. Your calculations for the or, ..., nb values are therefore compiled to the same bytecode as this code:
int ob = ((int) oldColor.getBlue()) * 255;
int or = ((int) oldColor.getRed()) * 255;
int og = ((int) oldColor.getGreen()) * 255;
int nb = ((int) newColor.getBlue()) * 255;
int nr = ((int) newColor.getRed()) * 255;
int ng = ((int) newColor.getGreen()) * 255;
Just add brackets to tell Java to do the multiplication before casting. Otherwise you'll only get 0 or 255 as results.
int ob = (int) (oldColor.getBlue() * 255);
int or = (int) (oldColor.getRed() * 255);
int og = (int) (oldColor.getGreen() * 255);
int nb = (int) (newColor.getBlue() * 255);
int nr = (int) (newColor.getRed() * 255);
int ng = (int) (newColor.getGreen() * 255);
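The precedence trap is easy to reproduce in isolation (a standalone sketch, not from the question):

```java
public class CastPrecedence {
    // Cast binds tighter than *: (int) 0.5 == 0, then 0 * 255 == 0.
    static int castFirst(double channel) {
        return (int) channel * 255;
    }

    // Parentheses force the multiplication first: 0.5 * 255 == 127.5, truncated to 127.
    static int multiplyFirst(double channel) {
        return (int) (channel * 255);
    }

    public static void main(String[] args) {
        System.out.println(castFirst(0.5));     // 0
        System.out.println(multiplyFirst(0.5)); // 127
    }
}
```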

Problem converting YUV to RGB ImageReader from android Camera2 using OpenCV, output image is in grayscale

I'm trying to convert an image from YUV to RGB inside the onImageAvailable method in Java, using OpenCV for the conversion.
I can't request RGB directly from the android Camera2 API, to avoid frame loss, and I can't figure out the best format to use for the conversion.
Image.Plane Y = image.getPlanes()[0];
Image.Plane U = image.getPlanes()[1];
Image.Plane V = image.getPlanes()[2];
Y.getBuffer().position(0);
U.getBuffer().position(0);
V.getBuffer().position(0);
int Yb = Y.getBuffer().remaining();
int Ub = U.getBuffer().remaining();
int Vb = V.getBuffer().remaining();
ByteBuffer buffer = ByteBuffer.allocateDirect( Yb + Ub + Vb);
buffer.put(Y.getBuffer());
buffer.put(U.getBuffer());
buffer.put(V.getBuffer());
// Image is 640 x 480
Mat yuvMat = new Mat(960, 640, CvType.CV_8UC1);
yuvMat.put(0, 0, buffer.array());
// I don't know what is the correct format
Mat rgbMat = new Mat(yuvMat.rows, yuvMat.cols, CvType.CV_8UC4);
Imgproc.cvtColor(yuvMat, rgbMat, Imgproc.COLOR_YUV420sp2RGBA);
final Bitmap bit = Bitmap.createBitmap(rgbMat.cols(), rgbMat.rows(), Bitmap.Config.ARGB_8888);
Utils.matToBitmap(rgbMat, bit);
Actually, I obtain only a cropped grayscale image.
Try this function:
void decodeYUV420SP(byte[] rgb, byte[] yuv420sp, int width, int height) {
    final int frameSize = width * height;
    for (int j = 0, yp = 0; j < height; j++) {
        int uvp = frameSize + (j >> 1) * width, u = 0, v = 0;
        for (int i = 0; i < width; i++, yp++) {
            int y = (0xff & ((int) yuv420sp[yp])) - 16;
            if (y < 0)
                y = 0;
            if ((i & 1) == 0) {
                v = (0xff & yuv420sp[uvp++]) - 128;
                u = (0xff & yuv420sp[uvp++]) - 128;
            }
            int y1192 = 1192 * y;
            int r = (y1192 + 1634 * v);
            int g = (y1192 - 833 * v - 400 * u);
            int b = (y1192 + 2066 * u);
            if (r < 0) r = 0; else if (r > 262143) r = 262143;
            if (g < 0) g = 0; else if (g > 262143) g = 262143;
            if (b < 0) b = 0; else if (b > 262143) b = 262143;
            // packed-int variant:
            // rgb[yp] = 0xff000000 | ((r << 6) & 0xff0000) | ((g >> 2) & 0xff00) | ((b >> 10) & 0xff);
            int nIdx = ((width - i - 1) * height + height - j - 1) * 3; // device
            // int nIdx = (i * height + j) * 3; // nox (emulator)
            rgb[nIdx] = (byte) (((r << 6) & 0xff0000) >> 16);
            rgb[nIdx + 1] = (byte) (((g >> 2) & 0xff00) >> 8);
            rgb[nIdx + 2] = (byte) ((b >> 10) & 0xff);
        }
    }
}
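The fixed-point constants above are scaled by 2^10, hence the final shifts. As a sanity check, a neutral pixel (U = V = 128) should come out gray; this hypothetical single-pixel version of the same math (my own harness, not part of the answer) shows video-range white and black decoding to near-white and black:

```java
public class YuvPixel {
    // Converts one YUV pixel to {r, g, b} using the same 2^10 fixed-point
    // constants as decodeYUV420SP, clamping intermediates to [0, 262143].
    static int[] toRgb(int yVal, int uVal, int vVal) {
        int y = Math.max(0, (yVal & 0xff) - 16);
        int u = (uVal & 0xff) - 128;
        int v = (vVal & 0xff) - 128;
        int y1192 = 1192 * y;
        int r = clamp(y1192 + 1634 * v);
        int g = clamp(y1192 - 833 * v - 400 * u);
        int b = clamp(y1192 + 2066 * u);
        // >> 10 undoes the fixed-point scaling
        return new int[] { (r >> 10) & 0xff, (g >> 10) & 0xff, (b >> 10) & 0xff };
    }

    static int clamp(int c) {
        return Math.min(262143, Math.max(0, c));
    }

    public static void main(String[] args) {
        int[] white = toRgb(235, 128, 128); // video-range white, neutral chroma
        int[] black = toRgb(16, 128, 128);  // video-range black, neutral chroma
        System.out.println(white[0] + "," + white[1] + "," + white[2]); // 254,254,254
        System.out.println(black[0] + "," + black[1] + "," + black[2]); // 0,0,0
    }
}
```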
Usage: decodeYUV420SP(rgb, camData, nWidth234, nHeight234);
That gives you an RGB byte array. If you need to get an image from the byte array, try this:
public boolean convertYuvToJpeg(byte[] data, int width, int height) {
    YuvImage image = new YuvImage(data, ImageFormat.NV21, width, height, null);
    ByteArrayOutputStream baos = new ByteArrayOutputStream();
    int quality = 20;
    image.compressToJpeg(new Rect(0, 0, width, height), quality, baos);
    byte[] jpegByteArray = baos.toByteArray();
    Bitmap bitmap = BitmapFactory.decodeByteArray(jpegByteArray, 0, jpegByteArray.length);
    Matrix matrix = new Matrix();
    matrix.postRotate(-90);
    Bitmap lastbitmap = Bitmap.createBitmap(bitmap, 0, 0, bitmap.getWidth(), bitmap.getHeight(), matrix, true);
    try {
        File file = new File(BaseApplication.DIRECTORY + mCode + ".png");
        if (!file.exists()) {
            RandomAccessFile me = new RandomAccessFile(BaseApplication.DIRECTORY + mCode + ".png", "rw");
            me.writeInt(5);
            me.close();
            file = new File(BaseApplication.DIRECTORY + mCode + ".png");
        }
        FileOutputStream fos = new FileOutputStream(file);
        lastbitmap.compress(Bitmap.CompressFormat.PNG, quality, fos);
    } catch (IOException e) {
        e.printStackTrace();
        return false;
    }
    return true;
}

Convert JPEG byte[] to NV21 byte[]

I am trying to convert the JPEG byte[] data from Camera.PictureCallback to an NV21 byte[], but it doesn't work.
Here is what I tried:
byte[] getNV21(int inputWidth, int inputHeight, Bitmap scaled) {
    int[] argb = new int[inputWidth * inputHeight];
    scaled.getPixels(argb, 0, inputWidth, 0, 0, inputWidth, inputHeight);
    byte[] yuv = new byte[inputWidth * inputHeight * 3 / 2];
    encodeYUV420SP(yuv, argb, inputWidth, inputHeight);
    scaled.recycle();
    return yuv;
}

void encodeYUV420SP(byte[] yuv420sp, int[] argb, int width, int height) {
    final int frameSize = width * height;
    int yIndex = 0;
    int uvIndex = frameSize;
    int a, R, G, B, Y, U, V;
    int index = 0;
    for (int j = 0; j < height; j++) {
        for (int i = 0; i < width; i++) {
            a = (argb[index] & 0xff000000) >> 24; // a is not used obviously
            R = (argb[index] & 0xff0000) >> 16;
            G = (argb[index] & 0xff00) >> 8;
            B = (argb[index] & 0xff) >> 0;
            // well known RGB to YUV algorithm
            Y = ((66 * R + 129 * G + 25 * B + 128) >> 8) + 16;
            U = ((-38 * R - 74 * G + 112 * B + 128) >> 8) + 128;
            V = ((112 * R - 94 * G - 18 * B + 128) >> 8) + 128;
            // NV21 has a plane of Y and interleaved planes of VU, each subsampled
            // by a factor of 2: for every 4 Y pixels there are 1 V and 1 U. Note
            // the sampling is every other pixel AND every other scanline.
            yuv420sp[yIndex++] = (byte) ((Y < 0) ? 0 : ((Y > 255) ? 255 : Y));
            if (j % 2 == 0 && index % 2 == 0) {
                yuv420sp[uvIndex++] = (byte) ((V < 0) ? 0 : ((V > 255) ? 255 : V));
                yuv420sp[uvIndex++] = (byte) ((U < 0) ? 0 : ((U > 255) ? 255 : U));
            }
            index++;
        }
    }
}
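As a quick sanity check of the conversion constants (my own harness, not from the question): a pure white RGB pixel should map to video-range white, Y = 235, with neutral chroma U = V = 128, and black should map to Y = 16.

```java
public class RgbToYuv {
    // Same integer RGB -> YUV formulas as encodeYUV420SP above,
    // returning {Y, U, V} for a single pixel.
    static int[] toYuv(int r, int g, int b) {
        int y = ((66 * r + 129 * g + 25 * b + 128) >> 8) + 16;
        int u = ((-38 * r - 74 * g + 112 * b + 128) >> 8) + 128;
        int v = ((112 * r - 94 * g - 18 * b + 128) >> 8) + 128;
        return new int[] { y, u, v };
    }

    public static void main(String[] args) {
        int[] white = toYuv(255, 255, 255);
        System.out.println(white[0] + "," + white[1] + "," + white[2]); // 235,128,128
        int[] black = toYuv(0, 0, 0);
        System.out.println(black[0] + "," + black[1] + "," + black[2]); // 16,128,128
    }
}
```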

OpenGL (LWJGL) - Get Pixel Color of Texture

How do I go about getting the pixel color of an RGBA texture? Say I have a function like this:
public Color getPixel(int x, int y) {
int r = ...
int g = ...
int b = ...
int a = ...
return new Color(r, g, b, a);
}
I'm having a hard time getting glGetTexImage() to work:
int[] p = new int[size.x * size.y * 4];
ByteBuffer buffer = ByteBuffer.allocateDirect(size.x * size.y * 16);
glGetTexImage(GL_TEXTURE_2D, 0, GL_RGBA, GL_UNSIGNED_BYTE, buffer);
buffer.asIntBuffer().get(p);
for (int i = 0; i < p.length; i++) {
    p[i] = (int) (p[i] & 0xFF);
}
But I don't know how to access a pixel with a given coordinate.
Like this? Hope this helps you :)
public Color getPixel(BufferedImage image, int x, int y) {
    if (y < image.getHeight() && x < image.getWidth()) {
        // Pack the image into ARGB ints, one int per pixel
        int[] pixels = image.getRGB(0, 0, image.getWidth(), image.getHeight(), null, 0, image.getWidth());
        int pixel = pixels[y * image.getWidth() + x];
        int r = (pixel >> 16) & 0xFF; // Red
        int g = (pixel >> 8) & 0xFF;  // Green
        int b = pixel & 0xFF;         // Blue
        int a = (pixel >> 24) & 0xFF; // Alpha
        return new Color(r, g, b, a);
    } else {
        return new Color(0, 0, 0, 1);
    }
}
It's not tested but should work.
Here's what I did to accomplish this.
First, I set the pixels in a byte[] with glGetTexImage.
byte[] pixels = new byte[size.x * size.y * 4];
ByteBuffer buffer = ByteBuffer.allocateDirect(pixels.length);
glGetTexImage(GL_TEXTURE_2D, 0, GL_RGBA, GL_UNSIGNED_BYTE, buffer);
buffer.get(pixels);
Then, to get a pixel at a specific coordinate, here's the algorithm I used:
public Color getPixel(int x, int y) {
    if (x >= size.x || y >= size.y) {
        return null;
    }
    int index = (x + y * size.x) * 4;
    int r = pixels[index] & 0xFF;
    int g = pixels[index + 1] & 0xFF;
    int b = pixels[index + 2] & 0xFF;
    int a = pixels[index + 3] & 0xFF;
    return new Color(r, g, b, a);
}
This returns a Color object with arguments ranging from 0-255, as expected.
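The indexing scheme can be verified on a tiny hand-built buffer (a made-up 2x2 example, not the question's texture):

```java
public class PixelIndex {
    // Returns {r, g, b, a} of pixel (x, y) in a tightly packed,
    // row-major RGBA byte buffer of the given width.
    static int[] getPixel(byte[] pixels, int width, int x, int y) {
        int index = (x + y * width) * 4;
        return new int[] {
            pixels[index] & 0xFF,     // red
            pixels[index + 1] & 0xFF, // green
            pixels[index + 2] & 0xFF, // blue
            pixels[index + 3] & 0xFF  // alpha
        };
    }

    public static void main(String[] args) {
        // 2x2 RGBA image: red, green on the top row; black, blue on the bottom.
        byte[] img = {
            (byte) 255, 0, 0, (byte) 255,    0, (byte) 255, 0, (byte) 255,
            0, 0, 0, (byte) 255,             0, 0, (byte) 255, (byte) 255,
        };
        int[] p = getPixel(img, 2, 1, 1); // bottom-right pixel
        System.out.println(p[0] + "," + p[1] + "," + p[2] + "," + p[3]); // 0,0,255,255
    }
}
```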

problem with combining RGB binary image

I'm doing edge detection that detects edges in each RGB channel and then combines them into a final output. I'm now having a problem combining the three: the result is not a binary image, it has some colors in it. I have checked each binary image of the RGB channels individually and each works fine, giving a black-and-white image. To be clearer, the following is the code:
private void processActionPerformed(java.awt.event.ActionEvent evt) {
    width = inputimage.getWidth(null);
    height = inputimage.getHeight(null);
    inputbuff = new BufferedImage(width, height, BufferedImage.TYPE_INT_RGB);
    Graphics r = inputbuff.getGraphics();
    r.drawImage(inputimage, 0, 0, null);
    r.dispose();
    process_red = new int[width * height];
    process_green = new int[width * height];
    process_blue = new int[width * height];
    process_grey = new int[width * height];
    process_rgb = new int[width * height];
    process_combine = new int[width * height];
    for (int i = 0; i < 256; i++) {
        freq_red[i] = freq_green[i] = freq_blue[i] = freq_grey[i] = 0;
    }
    for (int x = 0; x < width; x++) {
        for (int y = 0; y < height; y++) {
            int clr = inputbuff.getRGB(y, x);
            int red = (clr & 0x00ff0000) >> 16;
            int green = (clr & 0x0000ff00) >> 8;
            int blue = clr & 0x000000ff;
            int grey = (11 * red + 16 * green + 5 * blue) / 32;
            freq_red[red] += 1;
            freq_green[green] += 1;
            freq_blue[blue] += 1;
            freq_grey[grey] += 1;
        }
    }
    int threshold = 150;
    for (int i = 0; i < 256; i++) {
        freq_red[i] = applyThreshold(threshold, freq_red[i]);
        freq_green[i] = applyThreshold(threshold, freq_green[i]);
        freq_blue[i] = applyThreshold(threshold, freq_blue[i]);
        freq_grey[i] = applyThreshold(threshold, freq_grey[i]);
    }
    for (int x = 0; x < width; x++) {
        for (int y = 0; y < height; y++) {
            int clr = inputbuff.getRGB(y, x);
            int red = (clr & 0x00ff0000) >> 16;
            int green = (clr & 0x0000ff00) >> 8;
            int blue = clr & 0x000000ff;
            int grey = (11 * red + 16 * green + 5 * blue) / 32;
            red = freq_red[red];
            green = freq_green[green];
            blue = freq_blue[blue];
            grey = freq_grey[grey];
            int alpha = 0xff000000;
            int combine = alpha | (red << 16) | (green << 8) | blue;
            process_red[x * height + y] = (0xFF << 24) | (red << 16) | (red << 8) | red;
            process_green[x * height + y] = (0xFF << 24) | (green << 16) | (green << 8) | green;
            process_blue[x * height + y] = (0xFF << 24) | (blue << 16) | (blue << 8) | blue;
            process_grey[x * height + y] = (0xFF << 24) | (grey << 16) | (grey << 8) | grey;
            process_rgb[x * height + y] = clr;
            process_combine[x * height + y] = combine;
        }
    }
    image_red = new JFrame().createImage(new MemoryImageSource(width, height, process_red, 0, width));
    image_green = new JFrame().createImage(new MemoryImageSource(width, height, process_green, 0, width));
    image_blue = new JFrame().createImage(new MemoryImageSource(width, height, process_blue, 0, width));
    image_grey = new JFrame().createImage(new MemoryImageSource(width, height, process_grey, 0, width));
    image_rgb = new JFrame().createImage(new MemoryImageSource(width, height, process_rgb, 0, width));
    image_combine = new JFrame().createImage(new MemoryImageSource(width, height, process_combine, 0, width));
    buff_red = new BufferedImage(width, height, BufferedImage.TYPE_INT_RGB);
    buff_green = new BufferedImage(width, height, BufferedImage.TYPE_INT_RGB);
    buff_blue = new BufferedImage(width, height, BufferedImage.TYPE_INT_RGB);
    buff_grey = new BufferedImage(width, height, BufferedImage.TYPE_INT_RGB);
    buff_rgb = new BufferedImage(width, height, BufferedImage.TYPE_INT_RGB);
    buff_combine = new BufferedImage(width, height, BufferedImage.TYPE_INT_RGB);
    graph_red = buff_red.getGraphics();
    graph_green = buff_green.getGraphics();
    graph_blue = buff_blue.getGraphics();
    graph_grey = buff_grey.getGraphics();
    graph_rgb = buff_rgb.getGraphics();
    graph_combine = buff_combine.getGraphics();
    graph_red.drawImage(image_red, 0, 0, null);
    graph_green.drawImage(image_green, 0, 0, null);
    graph_blue.drawImage(image_blue, 0, 0, null);
    graph_grey.drawImage(image_grey, 0, 0, null);
    graph_rgb.drawImage(image_rgb, 0, 0, null);
    graph_combine.drawImage(image_combine, 0, 0, null);
    graph_red.dispose();
    graph_green.dispose();
    graph_blue.dispose();
    graph_grey.dispose();
    graph_rgb.dispose();
    graph_combine.dispose();
    repaint();
}
I suspected that the problem is with the alpha value:
int alpha = 0xff000000;
int combine = alpha | (red << 16) | (green << 8) | blue;
However, when I removed the alpha value it doesn't display anything. Can anyone please help me? Thanks in advance!
I am guessing that freq_red etc. are byte arrays. If so, then you are being bitten by byte sign extension.
Try replacing this
red = freq_red[red];
green = freq_green[green];
blue = freq_blue[blue];
grey = freq_grey[grey];
with this:
red = freq_red[red] & 0xFF;
green = freq_green[green] & 0xFF;
blue = freq_blue[blue] & 0xFF;
grey = freq_grey[grey] & 0xFF;
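Byte sign extension is easy to demonstrate in isolation (a standalone sketch, not from the answer):

```java
public class SignExtension {
    // Widening a byte to int copies the sign bit into the upper 24 bits.
    static int widen(byte b) {
        return b; // implicit widening conversion, sign-extended
    }

    // Masking with 0xFF keeps only the low 8 bits, recovering 0..255.
    static int mask(byte b) {
        return b & 0xFF;
    }

    public static void main(String[] args) {
        byte b = (byte) 200;         // stored as 0xC8
        System.out.println(widen(b)); // -56
        System.out.println(mask(b));  // 200
    }
}
```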
Update: your method is longer than it needs to be because of all the temporary images and Graphics objects (graph_red etc.). You can avoid them by defining a method like this:
private BufferedImage wrapPixelArray(int width, int height, int[] process) {
    DataBuffer db = new DataBufferInt(process, width * height);
    SampleModel sm = new SinglePixelPackedSampleModel(DataBuffer.TYPE_INT, width, height, MASK);
    WritableRaster wr = Raster.createWritableRaster(sm, db, null);
    return new BufferedImage(RGB, wr, false, null);
}

private static final int[] MASK = {0xFF0000, 0xFF00, 0xFF};
private static final ColorModel RGB =
    new DirectColorModel(32, MASK[0], MASK[1], MASK[2]);
