I am working on an image super-resolution application that uses a TensorFlow Lite model. The model produces the output image as a ByteBuffer, which I convert to a Bitmap. I then try to display this Bitmap, but nothing shows up. The code I am using is below:
ByteBuffer out = ByteBuffer.allocate(4 * 384 * 384 * 3);
tflite.run(byteBuffer, out);
byte[] imageBytes = new byte[out.remaining()];
out.get(imageBytes);
final Bitmap outPut_Image = BitmapFactory.decodeByteArray(imageBytes, 0, imageBytes.length);
//Toast.makeText(this,tflite.toString(),Toast.LENGTH_LONG).show();
Toast.makeText(this, "Working",Toast.LENGTH_LONG).show();
ImageView imageView = (ImageView) this.findViewById(R.id.imageView2);
imageView.setImageBitmap(outPut_Image);
Please advise me on what I am doing wrong here.
BitmapFactory.decodeByteArray() is used for compressed image data such as JPEG.
For raw RGB888 data, I think you should convert it to ARGB_8888 format first.
After that you can use the following snippet to create the Bitmap object.
Bitmap bitmap = Bitmap.createBitmap(width, height, Bitmap.Config.ARGB_8888);
bitmap.copyPixelsFromBuffer(buffer);
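For the 384×384 output above, a minimal sketch of that conversion might look like the following. It assumes the model writes float32 RGB values in the range [0, 255] to the buffer (hence the 4 bytes per channel in the allocation); if your model outputs values in [0, 1], scale them by 255 first. The outputImage name is just illustrative.
ByteBuffer out = ByteBuffer.allocate(4 * 384 * 384 * 3);
tflite.run(byteBuffer, out);
out.rewind();  // reset the position before reading the floats back

int width = 384, height = 384;
int[] pixels = new int[width * height];
for (int i = 0; i < width * height; i++) {
    // Read one float per channel and clamp it to the valid 0..255 range.
    int r = Math.min(255, Math.max(0, Math.round(out.getFloat())));
    int g = Math.min(255, Math.max(0, Math.round(out.getFloat())));
    int b = Math.min(255, Math.max(0, Math.round(out.getFloat())));
    pixels[i] = 0xFF000000 | (r << 16) | (g << 8) | b;
}
Bitmap outputImage = Bitmap.createBitmap(pixels, width, height, Bitmap.Config.ARGB_8888);
imageView.setImageBitmap(outputImage);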
In my app I try to store an image from an ImageView as a byte array in a database:
ImageView imageView = (ImageView) findViewById(R.id.imgpreview);
Bitmap bitmap = ((BitmapDrawable) imageView.getDrawable()).getBitmap();
ByteArrayOutputStream baos = new ByteArrayOutputStream();
bitmap.compress(Bitmap.CompressFormat.JPEG, 100, baos);
byte[] byteArray = baos.toByteArray();
This byteArray is then stored in a database and extracted back out inside a custom GridAdapter, where it is converted back to a bitmap and set on the ImageView.
ImageView image = (ImageView)gridView.findViewById(R.id.image1);
TextView description = (TextView)gridView.findViewById(R.id.tv1);
Bitmap bmp = BitmapFactory.decodeByteArray(byteArray.get(position), 0, byteArray.get(position).length);
image.setImageBitmap(bmp);
It doesn't seem to be a problem with storing or retrieving the byte array: I log a toString of the byte array just before it is decoded, and it appears to be logged correctly. So I'm confused as to why it comes back null. The exact message is:
D/skia: --- SkImageDecoder::Factory returned null
I've looked through all the threads with solutions I could find and tried them all, but I just can't figure out how to fix it. I've tried varying image sizes. If any more info is needed, let me know. Thanks in advance.
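One thing worth noting: toString() on a byte[] in Java only prints the array's type and identity hash (something like [B@4f2b1a), not its contents, so the log line described above cannot confirm that the bytes actually survived the database round trip. A small diagnostic sketch, reusing the names from the snippets above and assuming both arrays are reachable in one place for the comparison:
byte[] stored = baos.toByteArray();          // bytes written to the database
byte[] retrieved = byteArray.get(position);  // bytes read back in the adapter
Log.d("ImageCheck", "stored=" + stored.length + " bytes, retrieved=" + retrieved.length + " bytes");
Log.d("ImageCheck", "identical=" + java.util.Arrays.equals(stored, retrieved));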
I am trying to create a helper function using the OpenCV Java API that processes an input image and returns the output as a byte array. The input image is a JPG file saved on the computer. The input and output images are displayed in a Java UI using Swing.
System.loadLibrary(Core.NATIVE_LIBRARY_NAME);
// Load image from file
Mat rgba = Highgui.imread(filePath);
Imgproc.cvtColor(rgba, rgba, Imgproc.COLOR_RGB2GRAY, 0);
// Convert back to byte[] and return
byte[] return_buff = new byte[(int) (rgba.total() * rgba.channels())];
rgba.get(0, 0, return_buff);
return return_buff;
When return_buff is returned and converted to a BufferedImage, I get null back. When I comment out the Imgproc.cvtColor call, return_buff converts properly to a BufferedImage that I can display. It seems like Imgproc.cvtColor returns a Mat object that I can't display in Java.
Here's my code to convert from byte[] to BufferedImage:
InputStream in = new ByteArrayInputStream(inputByteArray);
BufferedImage outputImage = ImageIO.read(in);
In the above code, outputImage is null.
Does anybody have any suggestions or ideas?
ImageIO.read(...) (and the javax.imageio package in general) is for reading/writing images from/to file formats. What you have is an array containing "raw" pixels. It's impossible for ImageIO to determine the file format from this byte array, so it returns null.
Instead, you should create a BufferedImage from the bytes directly. I don't know OpenCV that well, but I'm assuming that the result of Imgproc.cvtColor(rgba, rgba, Imgproc.COLOR_RGB2GRAY, 0) will be an image in grayscale (8 bits/sample, 1 sample/pixel). This is the same format as BufferedImage.TYPE_BYTE_GRAY. If this assumption is correct, you should be able to do:
// Read image to Mat as before
Mat rgba = ...;
Imgproc.cvtColor(rgba, rgba, Imgproc.COLOR_RGB2GRAY, 0);
// Create an empty image in matching format
BufferedImage gray = new BufferedImage(rgba.width(), rgba.height(), BufferedImage.TYPE_BYTE_GRAY);
// Get the BufferedImage's backing array and copy the pixels directly into it
byte[] data = ((DataBufferByte) gray.getRaster().getDataBuffer()).getData();
rgba.get(0, 0, data);
Doing it this way saves you one large byte array allocation and one byte array copy as a bonus. :-)
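Since the question mentions the images are shown in a Swing UI, a quick way to display the result (illustrative only; the frame and label below are not from the original code) is:
JFrame frame = new JFrame("Grayscale output");
frame.setDefaultCloseOperation(JFrame.DISPOSE_ON_CLOSE);
frame.add(new JLabel(new ImageIcon(gray)));  // gray is the BufferedImage built above
frame.pack();
frame.setVisible(true);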
I used this kind of code to convert a Mat object to a BufferedImage.
static BufferedImage Mat2BufferedImage(Mat matrix) throws Exception {
    // Encode the Mat to an in-memory JPEG, then let ImageIO decode it.
    MatOfByte mob = new MatOfByte();
    Imgcodecs.imencode(".jpg", matrix, mob);
    byte[] ba = mob.toArray();
    BufferedImage bi = ImageIO.read(new ByteArrayInputStream(ba));
    return bi;
}
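A brief usage sketch, assuming an OpenCV 3.x+ build (where Imgcodecs replaces the older Highgui used in the question) and handling the declared Exception as appropriate:
Mat mat = Imgcodecs.imread(filePath, Imgcodecs.IMREAD_GRAYSCALE);
BufferedImage image = Mat2BufferedImage(mat);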
I have a ByteArrayOutputStream that I created using the QRGen QR code generation library, and I want to turn it into a BufferedImage object and present it in an ImageView in Android. How may that be achieved?
Convert the ByteArrayOutputStream to a byte array:
byte[] data = baos.toByteArray(); //baos is your ByteArrayOutputStream object
Then create a bitmap out of it using BitmapFactory:
Bitmap bmp = BitmapFactory.decodeByteArray(data, 0, data.length);
Then take your ImageView and set its bitmap:
imageView.setImageBitmap(bmp);
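Putting the steps together; the QRCode.from(...).to(ImageType.PNG).stream() call is my recollection of QRGen's API and may need adjusting for your QRGen version, and the encoded text is just an example:
ByteArrayOutputStream baos = QRCode.from("https://example.com").to(ImageType.PNG).stream();
byte[] data = baos.toByteArray();
Bitmap bmp = BitmapFactory.decodeByteArray(data, 0, data.length);
imageView.setImageBitmap(bmp);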
I use the following code to get a bitmap from a string, but I get a null bitmap. Please guide me.
byte[] Image_getByte = Base64.decode(img);
ByteArrayInputStream bytes = new ByteArrayInputStream(Image_getByte);
BitmapDrawable bmd = new BitmapDrawable(bytes);
Bitmap bitmap = bmd.getBitmap();
Log.v("log","Home bitmap "+bitmap);
i.setImageBitmap(bitmap);
Use BitmapFactory.decodeByteArray(Image_getByte, 0, Image_getByte.length).
You don't need any String here; after getting the byte array, just pass it to the method above, which returns a Bitmap.
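Applied to the variables from the question (keeping its single-argument Base64.decode helper, whichever Base64 class that is):
byte[] Image_getByte = Base64.decode(img);
Bitmap bitmap = BitmapFactory.decodeByteArray(Image_getByte, 0, Image_getByte.length);
i.setImageBitmap(bitmap);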
The javadoc for getBitmap() says:
"Returns the bitmap used by this drawable to render. May be null."
If img is a Base64 String, use the following code to get the bitmap from it.
byte[] source = img.getBytes();
byte[] Image_getByte = Base64.decode(source);
Bitmap bitmap = BitmapFactory.decodeByteArray(Image_getByte, 0, Image_getByte.length);
imageView.setImageBitmap(bitmap);
This may help you. Please consider ImageContents as a String that contains the Base64-encoded image data.
byte[] imageAsBytes = Base64.decode(ImageContents.getBytes());
ImageView image = (ImageView)this.findViewById(R.id.ImageView);
image.setImageBitmap(
BitmapFactory.decodeByteArray(imageAsBytes, 0, imageAsBytes.length)
);
ImageView: in Android, you can use the android.widget.ImageView class to display an image file.
BitmapFactory: creates Bitmap objects from various sources, including files, streams, and byte arrays.
For more information, see the BitmapFactory documentation.
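Illustrative examples of those sources; the file path and drawable name below are hypothetical:
Bitmap fromFile  = BitmapFactory.decodeFile("/sdcard/Pictures/photo.jpg");
Bitmap fromRes   = BitmapFactory.decodeResource(getResources(), R.drawable.photo);
Bitmap fromBytes = BitmapFactory.decodeByteArray(imageAsBytes, 0, imageAsBytes.length);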
I have data that represents an image, and I want to show it in an ImageView.
Example of the image data:
zCH5BAAAAAAALAAAAADIABQAAAT/8MlJq7046827/2AojmRpnmiqrmzrvnAsz1ji3ARK3AMGOIbUgUYsjhA3RwFVuA0tO8ezpBj0jNgshpF0MGZN7wk5 ....
How can I do that programmatically? Thanks.
Either use API level 8 or higher, which provides android.util.Base64, or copy a Base64 class into your project for older versions. The android.util.Base64 reference is here:
http://developer.android.com/reference/android/util/Base64.html
Here base64String is a String holding the picture in Base64 format (the string you posted earlier), and imageView is the ImageView you want the picture shown in.
Then you can just use this snippet:
byte[] buf = Base64.decode(base64String, Base64.DEFAULT);
Bitmap bitmap = BitmapFactory.decodeByteArray(buf, 0, buf.length);
imageView.setImageBitmap(bitmap);