How to properly screenshot an Android device? - Java

The image that I get has a segment from the middle of the phone screen duplicated as an artifact on the right edge.
Android Code
I use MediaProjection and ImageReader.
// onCreate
display = getWindowManager().getDefaultDisplay();
display.getSize(displaySize);
display.getMetrics(displayMetrics);
imageReader = ImageReader.newInstance(displaySize.x, displaySize.y,
        PixelFormat.RGBA_8888, 2);

// onActivityResult
mediaProjection.createVirtualDisplay("screen", displaySize.x, displaySize.y,
        displayMetrics.densityDpi, DisplayManager.VIRTUAL_DISPLAY_FLAG_AUTO_MIRROR,
        imageReader.getSurface(), null, null);
// ScheduledThread
Image image = imageReader.acquireLatestImage();
if (image != null) {
    final Image.Plane[] planes = image.getPlanes();
    int pixelStride = planes[0].getPixelStride();
    int rowStride = planes[0].getRowStride();
    // Extra bytes at the end of each row that are not real pixels.
    int rowPadding = rowStride - pixelStride * image.getWidth();
    ByteBuffer buffer = planes[0].getBuffer();
    // The bitmap is made wider than the image so the row padding has somewhere to go.
    Bitmap bitmap = Bitmap.createBitmap(image.getWidth() + rowPadding / pixelStride,
            image.getHeight(), Bitmap.Config.ARGB_8888);
    bitmap.copyPixelsFromBuffer(buffer);
    if (bitmap != null) {
        // Compress the frame to JPEG and send it over the socket as Base64.
        ByteArrayOutputStream byteArrayOutputStream = new ByteArrayOutputStream();
        bitmap.compress(Bitmap.CompressFormat.JPEG, 50, byteArrayOutputStream);
        byte[] byteArray = byteArrayOutputStream.toByteArray();
        String encoded = Base64.encodeToString(byteArray, Base64.DEFAULT);
        socket.emit("frame", encoded);
    }
    image.close();
}
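One thing I have been wondering about: the bitmap is created wider than the screen to absorb the row padding, so the rightmost column is made of padding bytes rather than real pixels. A minimal sketch of cropping it back to the real width before compressing (just an idea, not verified to remove the artifact):

// Sketch: copy the padded buffer, then crop away the padding column.
Bitmap padded = Bitmap.createBitmap(image.getWidth() + rowPadding / pixelStride,
        image.getHeight(), Bitmap.Config.ARGB_8888);
padded.copyPixelsFromBuffer(buffer);
Bitmap cropped = Bitmap.createBitmap(padded, 0, 0, image.getWidth(), image.getHeight());
padded.recycle();
// "cropped" would then be compressed and emitted instead of the padded bitmap.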
Java Code (receiver side)
socket.on("frame", args1 -> {
    byte[] bytes = Base64.getMimeDecoder().decode((String) args1[0]);
    ByteArrayInputStream bin = new ByteArrayInputStream(bytes);
    try {
        BufferedImage img = ImageIO.read(bin);
        int type = img.getType() == 0 ? BufferedImage.TYPE_INT_ARGB : img.getType();
        img = resizeImage(img, type);
        if (!frameOpen) {
            openFrame(img.getWidth());
        }
        panel.removeAll();
        panel.add(new JLabel(new ImageIcon(img)));
        frame.pack();
    } catch (IOException ex) {
        ex.printStackTrace();
    }
});
private static BufferedImage resizeImage(BufferedImage originalImage, int type) {
    double ratio = 1.0 * originalImage.getWidth() / originalImage.getHeight();
    int imgHeight = Toolkit.getDefaultToolkit().getScreenSize().height - 100;
    int imgWidth = (int) (imgHeight * ratio);
    BufferedImage resizedImage = new BufferedImage(imgWidth, imgHeight, type);
    Graphics2D g = resizedImage.createGraphics();
    g.drawImage(originalImage, 0, 0, imgWidth, imgHeight, null);
    g.dispose();
    return resizedImage;
}
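As an aside on the viewer, a sketch of reusing one JLabel instead of removing and re-adding components on every frame (imageLabel is a hypothetical field of mine; this is unrelated to the artifact itself):

// Sketch: keep a single label and swap its icon each frame.
if (imageLabel == null) {           // imageLabel: hypothetical JLabel field
    imageLabel = new JLabel();
    panel.add(imageLabel);
}
imageLabel.setIcon(new ImageIcon(img));
panel.revalidate();
panel.repaint();
frame.pack();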
Result
As you can see in the screenshot, the thin line on the far right comes from the middle portion of the image.
What do you think is causing this?
EDIT: I know for a fact it's an error in the Android code. Could my ImageReader or createVirtualDisplay be set up wrong?
I was able to get rid of the black bars by using:
display.getRealSize(displaySize);
But that line segment is still there:
Repo
https://github.com/Decapitated/Android-Viewer

Related

I can't seem to downscale pixel art to single pixels without antialiasing

I am trying to downscale a pixel-art image (of the game Stardew Valley), consisting of 4x4-pixel blocks of the same color, to the same image with 1x1 pixel per block.
Photoshop does a great job when I simply resize it with NEAREST_NEIGHBOUR interpolation.
But when I use the technique from How to scale a BufferedImage, with TYPE_NEAREST_NEIGHBOR instead, it gets all distorted.
What is going wrong, and how should I go about fixing it?
BufferedImage old = getScreenShot();
int w = old.getWidth();
int h = old.getHeight();
int newWidth = w / 4;
int newHeight = h / 4;
BufferedImage resized = new BufferedImage(newWidth, newHeight, old.getType());
AffineTransform at = new AffineTransform();
at.scale(0.25, 0.25);
AffineTransformOp scaleOp = new AffineTransformOp(at, AffineTransformOp.TYPE_NEAREST_NEIGHBOR);
resized = scaleOp.filter(old, resized);
// TODO (remove) for debugging the screen-capture capability
File outputfile = new File("C:/Users/Kevin/Desktop/imagetestmap/test.jpg");
try {
    ImageIO.write(resized, "jpg", outputfile);
} catch (IOException e) {
    e.printStackTrace();
}
return resized;
Actual screenshot:
What Photoshop sees:
What my program sees:
Cris Luengo found the solution: I should have used .png instead of .jpg.
This is the working code:
BufferedImage old = getScreenShot();
int w = old.getWidth();
int h = old.getHeight();
int newWidth = w / 4;
int newHeight = h / 4;
BufferedImage resized = new BufferedImage(newWidth, newHeight, BufferedImage.TYPE_INT_ARGB);
AffineTransform at = new AffineTransform();
at.scale(0.25, 0.25);
AffineTransformOp scaleOp = new AffineTransformOp(at, AffineTransformOp.TYPE_NEAREST_NEIGHBOR);
resized = scaleOp.filter(old, resized);
// TODO (remove) for debugging the screen-capture capability
File outputfile = new File("C:/Users/Kevin/Desktop/imagetestmap/test.png");
try {
    ImageIO.write(resized, "png", outputfile);
} catch (IOException e) {
    e.printStackTrace();
}
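For comparison, the same nearest-neighbour downscale can also be done with Graphics2D and an interpolation hint (a sketch under the same 4x assumption; the output file name is made up, and saving as PNG is still what keeps the pixel colours exact):

BufferedImage resizedNN = new BufferedImage(w / 4, h / 4, BufferedImage.TYPE_INT_ARGB);
Graphics2D g = resizedNN.createGraphics();
// Nearest-neighbour sampling: each output pixel copies exactly one input pixel.
g.setRenderingHint(RenderingHints.KEY_INTERPOLATION,
        RenderingHints.VALUE_INTERPOLATION_NEAREST_NEIGHBOR);
g.drawImage(old, 0, 0, w / 4, h / 4, null);
g.dispose();
try {
    ImageIO.write(resizedNN, "png", new File("C:/Users/Kevin/Desktop/imagetestmap/test_nn.png"));
} catch (IOException e) {
    e.printStackTrace();
}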

java.awt.Image to InputStream

I'm resizing an image and need to return an InputStream object.
public InputStream resize(InputStream input, int maxSize) throws IOException {
    BufferedImage image = ImageIO.read(input);
    double scale = (double) image.getWidth() / maxSize;
    Image scaledImage = image.getScaledInstance((int) (image.getWidth() * scale),
            (int) (image.getHeight() * scale), Image.SCALE_SMOOTH);
    InputStream ret = (InputStream) scaledImage; // this is the wrong cast
    return ret;
}
How can I convert an Image to an InputStream?
You can use this code for converting:
BufferedImage bufferedImage = new BufferedImage(image.getWidth(null), image.getHeight(null), BufferedImage.TYPE_INT_RGB);
//bufferedImage is the RenderedImage to be written
Graphics2D g2 = bufferedImage.createGraphics();
g2.drawImage(image, null, null);
ByteArrayOutputStream outStream = new ByteArrayOutputStream();
ImageIO.write(bufferedImage, "jpg", outStream);
InputStream is = new ByteArrayInputStream(outStream.toByteArray());
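Putting the resize and the conversion together, a self-contained sketch (the method name and the scale handling are mine; it scales the width down to maxSize rather than reproducing the original ratio math):

public static InputStream resizeToStream(InputStream input, int maxSize) throws IOException {
    BufferedImage image = ImageIO.read(input);
    // Scale so the width becomes maxSize, keeping the aspect ratio.
    double scale = (double) maxSize / image.getWidth();
    int w = (int) (image.getWidth() * scale);
    int h = (int) (image.getHeight() * scale);
    BufferedImage scaled = new BufferedImage(w, h, BufferedImage.TYPE_INT_RGB);
    Graphics2D g2 = scaled.createGraphics();
    g2.drawImage(image, 0, 0, w, h, null);
    g2.dispose();
    // Encode to an in-memory byte array and expose it as an InputStream.
    ByteArrayOutputStream out = new ByteArrayOutputStream();
    ImageIO.write(scaled, "jpg", out);
    return new ByteArrayInputStream(out.toByteArray());
}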

Image file doesn't overwrite unless I restart Tomcat

I created a Java web application in which I wrote a program to resize and upload photos; the result is stored in a specific folder. When I change the photo, it doesn't change and the old photo remains. But after restarting Tomcat I can change the photo. Why does this happen? Here is the code to resize and store the image:
public static int createThumbnailNew(String original, String resized, int maxSize) {
    try {
        File originalFile = new File(original);
        ImageIcon ii = new ImageIcon(originalFile.getCanonicalPath());
        Image i = ii.getImage();
        int iWidth = i.getWidth(null);
        int iHeight = i.getHeight(null);
        BufferedImage originalImage = new BufferedImage(
                i.getWidth(null), i.getHeight(null), BufferedImage.TYPE_INT_RGB);
        // Copy image to buffered image.
        Graphics g = originalImage.createGraphics();
        // Clear background and paint the image.
        g.setColor(Color.white);
        g.fillRect(0, 0, i.getWidth(null), i.getHeight(null));
        g.drawImage(i, 0, 0, null);
        g.dispose();
        BufferedImage bufferedImage = null;
        if (iWidth > iHeight) {
            bufferedImage = resizeImage(originalImage, BufferedImage.TYPE_INT_RGB,
                    (maxSize * iHeight) / iWidth, maxSize);
        } else {
            bufferedImage = resizeImage(originalImage, BufferedImage.TYPE_INT_RGB,
                    maxSize, (maxSize * iWidth) / iHeight);
        }
        //BufferedImage croppedImage = cropImage(bufferedImage, crX, crY, crH, crW);
        File file = new File(resized);
        FileOutputStream out = new FileOutputStream(file);
        /* encodes image as a JPEG data stream */
        JPEGImageEncoder encoder = JPEGCodec.createJPEGEncoder(out);
        com.sun.image.codec.jpeg.JPEGEncodeParam param = encoder
                .getDefaultJPEGEncodeParam(bufferedImage);
        // writeParam = new JPEGImageWriteParam(null);
        // writeParam.setCompressionMode(JPEGImageWriteParam.MODE_EXPLICIT);
        // writeParam.setProgressiveMode(JPEGImageWriteParam.MODE_DEFAULT);
        param.setQuality(1.0f, true);
        encoder.setJPEGEncodeParam(param);
        encoder.encode(bufferedImage);
    } catch (Exception e) {
        return -1;
    }
    return 0;
}

private static BufferedImage resizeImage(BufferedImage originalImage, int type, int h, int w) {
    BufferedImage resizedImage = new BufferedImage(w, h, type);
    Graphics2D g = resizedImage.createGraphics();
    g.drawImage(originalImage, 0, 0, w, h, null);
    g.dispose();
    return resizedImage;
}
Please Help.. Thanks..
You haven't closed your opened files. That is one possible problem here.
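For illustration, a sketch of how the write could be done so the stream is always closed, using try-with-resources and the standard ImageIO API instead of the com.sun encoder (an assumption about the fix, not tested against your setup):

File file = new File(resized);
try (FileOutputStream out = new FileOutputStream(file)) {
    // The stream is closed when the try block exits, so the new file
    // should be fully written and visible without restarting Tomcat.
    ImageIO.write(bufferedImage, "jpg", out);
}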

How do I create an image from pixels stored as r, g, b values, each occupying a different position in an array?

I am reading a .jpg image and accessing the pixels like this:
if (type == BufferedImage.TYPE_3BYTE_BGR) {
    System.out.println("type.3byte.bgr");
    byte[] pixels = (byte[]) sourceImage.getData().getDataElements(0, 0, w, h, null);
}
// process this array called pixels and display the resulting image
// first I convert it to integer
int offseet = 0;
int[] data = new int[width * height * 3];
for (i = 0; i < data.length; i++) {
    data[i] = pixels[offset++] & 0xff;
}
// and then process this array. For now I am not processing it. When I create a buffered
// image from the data array and display it, the displayed image is not the same.
// The code I use is
writeEdges(data);

private BufferedImage edgesImage;

private void writeEdges(int arb[]) {
    if (edgesImage == null) {
        edgesImage = new BufferedImage(w, h, BufferedImage.TYPE_INT_RGB);
    }
    edgesImage.getWritableTile(0, 0).setDataElements(0, 0, width, height, arb);
}
For the penguins.jpg image (provided in the Windows 7 sample pictures), the output I get is:
I do not know how your code is running, because it would not compile, as jcomeau_ictx pointed out. The problem is that you use different image types for reading and writing.
Here is a code snippet that would generate the same image as the input.
public static void main(String[] args) {
    BufferedImage sourceImage = null;
    try {
        sourceImage = ImageIO.read(new File("IScSH.jpg"));
    } catch (IOException e) {
    }

    int type = sourceImage.getType();
    int w = sourceImage.getWidth();
    int h = sourceImage.getHeight();

    byte[] pixels = null;
    if (type == BufferedImage.TYPE_3BYTE_BGR) {
        System.out.println("type.3byte.bgr");
        pixels = (byte[]) sourceImage.getData().getDataElements(0, 0, w, h, null);
    }

    try {
        BufferedImage edgesImage = new BufferedImage(w, h, BufferedImage.TYPE_3BYTE_BGR);
        edgesImage.getWritableTile(0, 0).setDataElements(0, 0, w, h, pixels);
        ImageIO.write(edgesImage, "jpg", new File("nvRhP.jpg"));
    } catch (IOException e) {
    }
}
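If you do want the intermediate int array and a TYPE_INT_RGB image, as in the original attempt, the bytes have to be repacked: TYPE_3BYTE_BGR stores three bytes per pixel in blue, green, red order, while TYPE_INT_RGB expects one packed int per pixel. A sketch of that conversion (variable names follow the snippet above):

int[] rgb = new int[w * h];
for (int i = 0; i < rgb.length; i++) {
    int b = pixels[i * 3] & 0xff;      // TYPE_3BYTE_BGR: blue first
    int g = pixels[i * 3 + 1] & 0xff;
    int r = pixels[i * 3 + 2] & 0xff;
    rgb[i] = (r << 16) | (g << 8) | b; // pack into TYPE_INT_RGB layout
}
BufferedImage intImage = new BufferedImage(w, h, BufferedImage.TYPE_INT_RGB);
intImage.setRGB(0, 0, w, h, rgb, 0, w);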

How to decrease image thumbnail size in Java

I have an image a.jpg with size 4.7 KB; when it is uploaded to the web server, its size becomes 15.5 KB.
I am using following code.
BufferedImage bi = ImageIO.read(business.getImage()); // business is a Struts2 form; image is an instance of File
int height = bi.getHeight();
int width = bi.getWidth();
if (height > Constants.BIZ_IMAGE_HEIGHT || width > Constants.BIZ_IMAGE_WIDTH) {
    height = Constants.BIZ_IMAGE_HEIGHT;
    width = Constants.BIZ_IMAGE_WIDTH;
}
InputStream is = UtilMethod.scaleImage(new FileInputStream(business.getImage()), width, height);
File f = new File(businessImagesPath, business.getImageFileName());
UtilMethod.saveImage(f, is);
is.close();
UtilMethod.scaleImage(..) is as follows:
public static InputStream scaleImage(InputStream p_image, int p_width, int p_height) throws Exception {
    InputStream imageStream = new BufferedInputStream(p_image);
    Image image = (Image) ImageIO.read(imageStream);
    int thumbWidth = p_width;
    int thumbHeight = p_height;

    // Make sure the aspect ratio is maintained, so the image is not skewed
    double thumbRatio = (double) thumbWidth / (double) thumbHeight;
    int imageWidth = image.getWidth(null);
    int imageHeight = image.getHeight(null);
    double imageRatio = (double) imageWidth / (double) imageHeight;
    if (thumbRatio < imageRatio) {
        thumbHeight = (int) (thumbWidth / imageRatio);
    } else {
        thumbWidth = (int) (thumbHeight * imageRatio);
    }

    // Draw the scaled image
    BufferedImage thumbImage = new BufferedImage(thumbWidth, thumbHeight, BufferedImage.TYPE_INT_RGB);
    Graphics2D graphics2D = (Graphics2D) thumbImage.createGraphics();
    graphics2D.setRenderingHint(RenderingHints.KEY_INTERPOLATION,
            RenderingHints.VALUE_INTERPOLATION_BILINEAR);
    graphics2D.drawImage(image, 0, 0, thumbWidth, thumbHeight, Color.WHITE, null);

    // Write the scaled image to the output stream
    ByteArrayOutputStream out = new ByteArrayOutputStream();
    JPEGImageEncoder encoder = JPEGCodec.createJPEGEncoder(out);
    JPEGEncodeParam param = encoder.getDefaultJPEGEncodeParam(thumbImage);
    int quality = 85; // Use between 1 and 100, with 100 being highest quality
    quality = Math.max(0, Math.min(quality, 100));
    param.setQuality((float) quality / 100.0f, false);
    encoder.setJPEGEncodeParam(param);
    encoder.encode(thumbImage);
    ImageIO.write(thumbImage, "png", out);
    ByteArrayInputStream bis = new ByteArrayInputStream(out.toByteArray());
    return bis;
}
Any other size and quality optimization ideas for saving images in Java? I am using Struts2 MVC. Thank you so much.
int quality = 85; // Use between 1 and 100, with 100 being highest quality
This seems like a high quality for a JPEG thumbnail. Try around 60 or 50.
quality = Math.max(0, Math.min(quality, 100));
Huh?
param.setQuality((float) quality / 100.0f, false);
encoder.setJPEGEncodeParam(param);
OK..
encoder.encode(thumbImage);
ImageIO.write(thumbImage, "png", out);
But huh? Why set a JPEGEncodeParam and then store as a PNG? Does that even have any effect? Try:
ImageIO.write(thumbImage, "jpg", out);
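If you need explicit control over JPEG quality without the com.sun classes, a sketch using the standard ImageWriter API (the 0.6 quality value is just an example):

ByteArrayOutputStream out = new ByteArrayOutputStream();
ImageWriter writer = ImageIO.getImageWritersByFormatName("jpg").next();
ImageWriteParam param = writer.getDefaultWriteParam();
param.setCompressionMode(ImageWriteParam.MODE_EXPLICIT);
param.setCompressionQuality(0.6f); // roughly "quality 60"
try (ImageOutputStream ios = ImageIO.createImageOutputStream(out)) {
    writer.setOutput(ios);
    writer.write(null, new IIOImage(thumbImage, null, null), param);
} finally {
    writer.dispose();
}
InputStream result = new ByteArrayInputStream(out.toByteArray());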
