In my project I need to spot the differences among a set of images. As a first step I tried it with three images and wrote code that compares them pixel by pixel based on their RGB values. I store the coordinates of the matching pixels in a text file; from these coordinates I now need to produce an image.
import java.io.*;
import java.awt.*;
import javax.imageio.ImageIO;
import java.awt.image.BufferedImage;
class spe
{
public static void main(String args[])
throws IOException
{
long start = System.currentTimeMillis();
int q=0;
File file1 = new File("filename.txt");
FileWriter fw = new FileWriter(file1.getAbsoluteFile());
BufferedWriter bw = new BufferedWriter(fw);
File file= new File("2010.png");
BufferedImage image = ImageIO.read(file);
int width = image.getWidth(null);
int height = image.getHeight(null);
int[][] clr= new int[width][height];
File files= new File("2011.png");
BufferedImage images = ImageIO.read(files);
int widthe = images.getWidth(null);
int heighte = images.getHeight(null);
File file2=new File("2009.png");
BufferedImage image2=ImageIO.read(file2);
int wid=image2.getWidth(null);
int heig=image2.getHeight(null);
int[][] colo=new int[wid][heig];
int[][] clre= new int[widthe][heighte];
int smw=0;
int smh=0;
int p=0;
// bw.write("hai");
//CALCULATING THE SMALLEST VALUE AMONG THE WIDTHS AND HEIGHTS
if(width>widthe)
{
smw =widthe;
}
else
{
smw=width;
}
if(height>heighte)
{
smh=heighte;
}
else
{
smh=height;
}
//COUNTING THE NUMBER OF MATCHING PIXELS
for(int a=0;a<smw;a++)
{
for(int b=0;b<smh;b++)
{
clre[a][b]=images.getRGB(a,b);
clr[a][b]=image.getRGB(a,b);
colo[a][b]=image2.getRGB(a,b);
if(clr[a][b]==clre[a][b] && colo[a][b]==clre[a][b])
{
p=p+1;
bw.write("\t");
bw.write(Integer.toString(a));
bw.write("\t");
bw.write(Integer.toString(b));
bw.write("\n");
System.out.println(a+"\t"+b);
}
else
q=q+1;
}
}
float w,h=0;
if(width>widthe)
{
w=width;
}
else
{
w=widthe;
}
if(height>heighte)
{
h = height;
}
else
{
h = heighte;
}
float s = (smw*smh);
//CALCULATING PERCENTAGE
float x =(100*p)/s;
System.out.println("THE PERCENTAGE SIMILARITY IS APPROXIMATELY ="+x+"%");
long stop = System.currentTimeMillis();
System.out.println("TIME TAKEN IS ="+(stop-start));
System.out.println("NO OF PIXEL GETS VARIED:="+q);
System.out.println("NO OF PIXEL GETS MATCHED:="+p);
}
}
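If the goal is to turn those stored coordinates back into a picture, one option is to build a BufferedImage during the same comparison loop instead of (or as well as) writing the coordinates to the text file. The sketch below reuses the variables from the code above (image, images, image2, smw, smh); marking differing pixels in red is my own arbitrary choice:

// Sketch: build a difference image while comparing the three images.
// Pixels that match in all three images keep their original colour;
// pixels that differ are highlighted in red.
BufferedImage diff = new BufferedImage(smw, smh, BufferedImage.TYPE_INT_RGB);
for (int a = 0; a < smw; a++) {
    for (int b = 0; b < smh; b++) {
        int rgb = image.getRGB(a, b);
        if (rgb == images.getRGB(a, b) && rgb == image2.getRGB(a, b)) {
            diff.setRGB(a, b, rgb);          // unchanged pixel
        } else {
            diff.setRGB(a, b, 0xFFFF0000);   // mark the difference in red
        }
    }
}
ImageIO.write(diff, "png", new File("difference.png"));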
Here is the code; I am getting a black image.
package example;
import java.awt.image.BufferedImage;
import java.io.File;
import javax.imageio.ImageIO;
public class imageCopy {
public static void main(String[] args) {
BufferedImage img = null;
File f = null;
try {
f = new File("E:\\unnamed.png");
img = ImageIO.read(f);
}catch(Exception e) {
e.printStackTrace();
}
int width = img.getWidth();
int height = img.getHeight();
for(int i=0;i<height;i++) {
for(int j=0;j<width;j++) {
int p = img.getRGB(j, i);
int k = p << -2 >>> -2;
img.setRGB(j, i, k);
}
}
try {
f = new File("E:\\Output.png");
ImageIO.write(img, "png", f);
}catch(Exception e) {
e.printStackTrace();
}
}
}
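A likely reason for the black result (my own reading of the code, not from the original post): in Java the shift distance for an int is taken modulo 32, so p << -2 >>> -2 is effectively p << 30 >>> 30, which discards everything except the two lowest bits of the blue channel and also zeroes the alpha channel. If the intent is to work on individual channels, extracting and reassembling the ARGB components explicitly avoids the surprise. A minimal sketch for the inner loop:

int p = img.getRGB(j, i);
int a = (p >> 24) & 0xFF;   // alpha
int r = (p >> 16) & 0xFF;   // red
int g = (p >> 8)  & 0xFF;   // green
int b = p & 0xFF;           // blue
// ...adjust the channels here as needed...
int k = (a << 24) | (r << 16) | (g << 8) | b;  // repack into ARGB
img.setRGB(j, i, k);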
I have a grayscale image with dimensions 256*256 and I am trying to downscale it to 128*128.
I take the average of two pixels and write it to the output file.
import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;
import java.nio.file.Files;
class Start {
public static void main (String [] args) throws IOException {
File input= new File("E:\\input.raw");
File output= new File("E:\\output.raw");
new Start().resizeImage(input,output,2);
}
public void resizeImage(File input, File output, int downScaleFactor) throws IOException {
byte[] fileContent= Files.readAllBytes(input.toPath());
FileOutputStream stream= new FileOutputStream(output);
int i=0;
int j=1;
int result=0;
for(;i<fileContent.length;i++)
{
if(j>1){
// skip the records.
j--;
continue;
}
else {
result = fileContent[i];
for (; j < downScaleFactor; j++) {
result = ((result + fileContent[i + j]) / 2);
}
j++;
stream.write( fileContent[i]);
}
}
stream.close();
}
}
The above code runs successfully and the output file is indeed smaller, but when I try to convert
the output file (a raw file) to JPG online (https://www.iloveimg.com/convert-to-jpg/raw-to-jpg) it gives an error saying the file is corrupt.
I converted the input file with the same online tool and it works perfectly, so something in my code is producing a corrupt file.
How can I correct it?
P.S. I cannot use any library that downscales an image directly.
Your code is not handling the image resizing itself.
See how-to-resize-images-in-java.
I am copying a simple version here:
import java.awt.Graphics2D;
import java.awt.image.BufferedImage;
import java.io.File;
import java.io.IOException;
import javax.imageio.ImageIO;
public class ImageResizer {
public static void resize(String inputImagePath,
String outputImagePath, int scaledWidth, int scaledHeight)
throws IOException {
// reads input image
File inputFile = new File(inputImagePath);
BufferedImage inputImage = ImageIO.read(inputFile);
// creates output image
BufferedImage outputImage = new BufferedImage(scaledWidth,
scaledHeight, inputImage.getType());
// scales the input image to the output image
Graphics2D g2d = outputImage.createGraphics();
g2d.drawImage(inputImage, 0, 0, scaledWidth, scaledHeight, null);
g2d.dispose();
// extracts extension of output file
String formatName = outputImagePath.substring(outputImagePath
.lastIndexOf(".") + 1);
// writes to output file
ImageIO.write(outputImage, formatName, new File(outputImagePath));
}
public static void resize(String inputImagePath,
String outputImagePath, double percent) throws IOException {
File inputFile = new File(inputImagePath);
BufferedImage inputImage = ImageIO.read(inputFile);
int scaledWidth = (int) (inputImage.getWidth() * percent);
int scaledHeight = (int) (inputImage.getHeight() * percent);
resize(inputImagePath, outputImagePath, scaledWidth, scaledHeight);
}
public static void main(String[] args) {
String inputImagePath = "resources/snoopy.jpg";
String outputImagePath1 = "target/Puppy_Fixed.jpg";
String outputImagePath2 = "target/Puppy_Smaller.jpg";
String outputImagePath3 = "target/Puppy_Bigger.jpg";
try {
// resize to a fixed width (not proportional)
int scaledWidth = 1024;
int scaledHeight = 768;
ImageResizer.resize(inputImagePath, outputImagePath1, scaledWidth, scaledHeight);
// resize smaller by 50%
double percent = 0.5;
ImageResizer.resize(inputImagePath, outputImagePath2, percent);
// resize bigger by 50%
percent = 1.5;
ImageResizer.resize(inputImagePath, outputImagePath3, percent);
} catch (IOException ex) {
System.out.println("Error resizing the image.");
ex.printStackTrace();
}
}
}
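The ImageResizer above works on image files that ImageIO can decode, so it does not apply directly to a headerless .raw file. If the output has to remain raw grayscale bytes (the question rules out libraries that downscale directly), the averaging needs to treat the data as a 2-D grid: the original loop only averages neighbouring bytes within the flat array (shrinking rows but not columns) and then writes fileContent[i] instead of the computed average. A minimal sketch of 2x2 block averaging; the 256x256 size, the downscale factor of 2, and the file paths are taken from the question, everything else is an assumption:

import java.io.FileOutputStream;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;

public class RawDownscale {
    public static void main(String[] args) throws IOException {
        int srcW = 256, srcH = 256, factor = 2;   // geometry taken from the question
        byte[] src = Files.readAllBytes(Paths.get("E:\\input.raw"));
        int dstW = srcW / factor, dstH = srcH / factor;
        byte[] dst = new byte[dstW * dstH];
        for (int y = 0; y < dstH; y++) {
            for (int x = 0; x < dstW; x++) {
                int sum = 0;
                // average each factor x factor block of source pixels
                for (int dy = 0; dy < factor; dy++) {
                    for (int dx = 0; dx < factor; dx++) {
                        sum += src[(y * factor + dy) * srcW + (x * factor + dx)] & 0xFF;
                    }
                }
                dst[y * dstW + x] = (byte) (sum / (factor * factor));
            }
        }
        try (FileOutputStream out = new FileOutputStream("E:\\output.raw")) {
            out.write(dst);
        }
    }
}

The result is again a headerless raw buffer, now 128x128, so the converter must be told the new dimensions.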
I was using OpenCV to apply filters and ran into a problem.
In this specific code I first try to produce a grayscale image while preserving the colour channels. That part works, but when I write the result to a file with ImageIO I find the alpha has been altered. I checked the BGRA and ABGR colour spaces, but it still does not work and gives me a transparent image.
public static BufferedImage sepia(BufferedImage image,int intensity)
{
Mat imageMat = bufferedToMat(image);
int sepiaDepth = 20;
int width = image.getWidth();
int height = image.getHeight();
Mat grayScaleMat = new Mat(imageMat.height(),imageMat.width(),CvType.CV_8UC4);
imageMat.copyTo(grayScaleMat);
// double[] test = imageMat.get(0, 0);
// System.out.println(test[0]+" "+test[1]+" "+test[2]+" "+test[3]);
for(int i=0;i<grayScaleMat.cols();i++)
{
for(int j=0;j<grayScaleMat.rows();j++)
{
//can be optimised
double[] data = grayScaleMat.get(j, i);
//System.out.println(data.length);
double blue = data[0];
double green = data[1];
double red = data[2];
//System.out.println(red+" "+blue+" "+green);
double gray = (red + blue + green)/3.0;
//data[0] = gray;
data[0] = gray;
data[1] = gray;
data[2] = gray;
grayScaleMat.put(j, i, data);
}
}
return (Utility.matToBuffered(grayScaleMat));
}
//Only Testing Remove Later
public static void main(String[] args)
{
System.loadLibrary(Core.NATIVE_LIBRARY_NAME);
int beta = 25;
String imagePath = "/home/arjun/Pictures/Lenna.png";
BufferedImage image = null;
try{
image = ImageIO.read(new File(imagePath));
}catch(IOException e)
{
e.printStackTrace();
}
int x = image.getType();
System.out.println(x);
BufferedImage output = sepia(image,beta);
int y = output.getType();
System.out.println(y);
File outputfile = new File("/home/arjun/Pictures/sepia2.png");
try{
ImageIO.write(output, "png", outputfile);
}catch(IOException e)
{
e.printStackTrace();
}
}
And here are the BufferedImage/Mat conversions:
public static Mat bufferedToMat(BufferedImage image)
{
byte[] pixels = ((DataBufferByte)image.getRaster().getDataBuffer()).getData();
Mat imageMat = new Mat(image.getHeight(),image.getWidth(),CvType.CV_8UC4);
imageMat.put(0, 0, pixels);
return imageMat;
}
public static BufferedImage matToBuffered(Mat imageMat)
{
BufferedImage out;
byte[] data = new byte[imageMat.cols()*imageMat.rows()*(int)imageMat.elemSize()];
imageMat.get(0, 0,data);
int type = BufferedImage.TYPE_3BYTE_BGR;
if(imageMat.channels() == 1)
{
type = BufferedImage.TYPE_BYTE_GRAY;
}
else if(imageMat.channels() == 3)
{
type = BufferedImage.TYPE_3BYTE_BGR;
}
else if(imageMat.channels() == 4)
{
type = BufferedImage.TYPE_4BYTE_ABGR;
}
out = new BufferedImage(imageMat.cols(),imageMat.rows(),type);
out.getRaster().setDataElements(0,0,imageMat.cols(),imageMat.rows(),data);
return out;
}
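A plausible cause of the transparent output (my own diagnosis, not stated in the original post): bufferedToMat always builds a CV_8UC4 Mat, but a PNG without an alpha channel is usually decoded by ImageIO as TYPE_3BYTE_BGR, so the 3-byte pixels are reinterpreted as 4-byte data and the fourth channel, later treated as alpha, ends up with arbitrary values. One way to avoid the mismatch is to choose the Mat type from the number of components in the BufferedImage. A sketch:

public static Mat bufferedToMat(BufferedImage image) {
    byte[] pixels = ((DataBufferByte) image.getRaster().getDataBuffer()).getData();
    // Pick the Mat depth that matches the BufferedImage layout instead of
    // always assuming four channels per pixel.
    int components = image.getColorModel().getNumComponents();
    int cvType = (components == 4) ? CvType.CV_8UC4
               : (components == 3) ? CvType.CV_8UC3
               : CvType.CV_8UC1;
    Mat imageMat = new Mat(image.getHeight(), image.getWidth(), cvType);
    imageMat.put(0, 0, pixels);
    return imageMat;
}

The grayscale loop only reads data[0] to data[2], so it works for both three- and four-channel Mats, and matToBuffered already picks the BufferedImage type from the channel count.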
import java.awt.image.BufferedImage;
import java.io.File;
import java.io.IOException;
import javax.imageio.ImageIO;
import org.bytedeco.javacv.FFmpegFrameGrabber;
public class FrameData
{
int count = 0;
int picWidth;
int picHeight;
BufferedImage img = null;
//GET FRAME COUNT
public int gf_count(int numofFrames, BufferedImage[] frameArray, String fileLocationsent, String videoNamesent) throws IOException
{
String fileLocation = fileLocationsent;
String videoName = videoNamesent;
int frameNums = numofFrames;
int totFrames = 0;
FFmpegFrameGrabber grab = new FFmpegFrameGrabber(fileLocation + videoName);
try { grab.start(); }
catch (Exception e) { System.out.println("Unable to grab frames"); }
for(int i = 0 ; i < frameNums ; i++)
{
try
{
frameArray[i]= grab.grab().getBufferedImage();
totFrames = i;
File outputfile = new File(fileLocation + "GrayScaledImage" + i + ".jpg");
ImageIO.write(frameArray[i], "jpg", outputfile);
}
catch (Exception e) { /*e.printStackTrace();*/ }
}//END for
return totFrames;
}//END METHOD long getFrameCount()
Hope someone can explain this to me.
I am just learning Java, so here goes:
I wrote this code to count the number of frames in a .mov file, and to test my buffered-image array I wrote the images out as files. As the code stands, it works as planned. The problem is that if I instead write the BufferedImages out as files immediately after the capturing loop, they all seem to be just the first image. See the example below.
for(int i = 0 ; i < frameNums ; i++)
{
try
{
frameArray[i]= grab.grab().getBufferedImage();
totFrames = i;
File outputfile = new File(fileLocation + "GrayScaledImage" + i + ".jpg");
ImageIO.write(frameArray[i], "jpg", outputfile);
}
catch (Exception e) { /*e.printStackTrace();*/ }
}//END for
And now if I change that to...
for(int i = 0 ; i < frameNums ; i++)
{
try
{
frameArray[i]= grab.grab().getBufferedImage();
totFrames = i;
}
catch (Exception e) { /*e.printStackTrace();*/ }
}
for(int j = 0; j < frameNums; j++)
{
File outputfile = new File(fileLocation + "GrayScaledImage" + j + ".jpg");
ImageIO.write(frameArray[j], "jpg", outputfile);
}
I don't understand why I am getting the same image repeatedly.
If further information is required, just let me know; this is my first programming question online. Usually I find that others have already asked what I am looking for, but I couldn't find this one.
Thanks for your time,
Ken
The problem is that the grab().getBufferedImage() does its work in the same buffer every time. When you assign a reference to that buffer in your loop, you are assigning a reference to the same buffer numofFrames times. What you are writing then is not the first frame, but the last frame. In order to fix this you need to do a "deep copy" of the BufferedImage. See code below:
import java.awt.Color;
import java.awt.Graphics2D;
import java.awt.image.BufferedImage;
import java.awt.image.ColorModel;
import java.awt.image.WritableRaster;
import java.io.File;
import java.io.IOException;
import javax.imageio.ImageIO;
import org.bytedeco.javacv.FFmpegFrameGrabber;
import org.bytedeco.javacv.Java2DFrameConverter;
public class FrameData {
BufferedImage img;
Graphics2D g2;
// GET FRAME COUNT
public int gf_count(int numFrames, BufferedImage[] frameArray, String fileLocation, String videoName) throws Exception, IOException {
Java2DFrameConverter converter = new Java2DFrameConverter();
int totFrames = 0;
img = new BufferedImage(100, 50, BufferedImage.TYPE_INT_ARGB);
g2 = img.createGraphics();
FFmpegFrameGrabber grab = new FFmpegFrameGrabber(fileLocation + videoName);
grab.start();
for (int i = 0; i < numFrames; i++) {
frameArray[i] = deepCopy(converter.convert(grab.grab()));
totFrames = i;
}
for (int j = 0; j < totFrames; j++) {
File outputfile = new File(fileLocation + "TImage" + j + ".jpg");
ImageIO.write(frameArray[j], "jpg", outputfile);
}
return totFrames;
}// END METHOD long getFrameCount()
BufferedImage deepCopy(BufferedImage bi) {
ColorModel cm = bi.getColorModel();
boolean isAlphaPremultiplied = cm.isAlphaPremultiplied();
WritableRaster raster = bi.copyData(null);
return new BufferedImage(cm, raster, isAlphaPremultiplied, null);
}
// This does what the converter.convert seems to do, which
// is decode an image into the same place over and over.
// if you don't copy the result every time, then you end up
// with an array of references to the same last frame.
BufferedImage draw() {
g2.setColor(new Color(-1));
g2.fillRect(0, 0, 100, 50);
g2.setColor(new Color(0));
g2.drawLine(0, 0, (int)(Math.random()*100.0), (int)(Math.random()*50.0));
return img;
}
public static void main(String... args) throws Exception, IOException {
new FrameData().run();
}
private void run() throws Exception, IOException {
BufferedImage images[] = new BufferedImage[50];
gf_count(50, images, "C:/Users/karl/Videos/", "dance.mpg");
}
}
I have included a draw() method that shows by example how work is done in the same BufferedImage repeatedly, in case you want to replicate the problem.
There are certainly other ways to do a deep copy and there may be issues with the one shown. Reference: How do you clone a BufferedImage.
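As one more illustration (a sketch of my own, not part of the original answer), drawing the grabbed frame into a freshly allocated BufferedImage is another simple way to get an independent copy:

BufferedImage deepCopy(BufferedImage src) {
    // Allocate a new raster and paint the source into it, so the copy no
    // longer shares pixel storage with the grabber's reusable buffer.
    BufferedImage copy = new BufferedImage(src.getWidth(), src.getHeight(),
            BufferedImage.TYPE_INT_RGB); // use TYPE_INT_ARGB if alpha matters
    Graphics2D g = copy.createGraphics();
    g.drawImage(src, 0, 0, null);
    g.dispose();
    return copy;
}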
PS> I updated the code to use the 1.1 version of the bytedeco library.
I have written a program to encrypt an image in NetBeans. The program works fine when run from NetBeans, but when I build it into a .jar file it does not work: it cannot read the image, even though I placed the image file in the same folder as the .jar file.
package test;
import java.io.IOException;
import java.io.File;
/**
*
* @author AMaR
*/
public class Test {
/**
* @param args the command line arguments
*/
public static void main(String[] args) throws IOException, Exception {
File EnImage = new File("encrypted.png");
File DeImage = new File("decrypted.png");
int[] pixels;
LoadImage l = new LoadImage();
l.load();
pixels= l.getImagePixels();
RC4New rc4 = new RC4New();
int key[]= {13,2,4,6,};
// int data[]={5,10,90,5};
rc4.KSA(key);
int[] text = rc4.PRNG(pixels);
l.write((int)512,(int)512,text,EnImage);
//RC4New rc41 = new RC4New();
rc4.KSA(key);
int[] text1 = rc4.PRNG(text);
l.write((int)512,(int)512,text1,DeImage);
/* for(int i=0;i<text.length;i++){
System.out.println(text[i]);
}
RC4New rc41 = new RC4New();
rc4.KSA(key);
int[] text1 = rc4.PRNG(text);
for(int i=0;i<text1.length;i++){
System.out.println(text1[i]);
}
*/
System.out.println("length:"+pixels.length);
// l.write((int)512,(int)512,text);
// TODO code application logic here
}
}
//encryption
package test;
/**
*
* @author AMaR
*/
public class RC4New {
int state[] = new int[256];
int j;
/**
*
* @param key
*/
public void KSA(int[] key){
int tmp;
for (int i=0; i < 256; i++) {
state[i] = i;
}
j=0;
for (int i=0; i < 256; i++) {
j = (j + state[i] + key[i % key.length]) % 256;
tmp = state[i];
state[i] = state[j];
state[j] = tmp;
}
}
public int[] PRNG(int[] data){
int tmp,k;
int i=0;
j=0;
int[] cipherText = new int[data.length];
for(int x=0;x<data.length;x++){
i = (i + 1) % 256;
j = (j + state[i]) % 256;
tmp = state[i];
state[i] = state[j];
state[j] = tmp;
k = state[(state[i] + state[j]) % 256];
cipherText[x]= (data[x] ^ k);
}
return cipherText;
}
}
//loading/writing image
package test;
import java.awt.Dimension;
import java.awt.image.BufferedImage;
import java.awt.image.Raster;
import java.io.IOException;
import javax.imageio.ImageIO;
import java.io.File;
import java.awt.image.WritableRaster;
/**
*
* @author AMaR
*/
public class LoadImage {
BufferedImage image;
void load()throws Exception {
// FIle newfile = new File("lena.png)
image = ImageIO.read(getClass().getResourceAsStream("lena.png"));
}
public Dimension getImageSize() {
return new Dimension(image.getWidth(), image.getHeight());
}
public int[] getImagePixels() {
int [] dummy = null;
int wid, hgt;
// compute size of the array
wid = image.getWidth();
hgt = image.getHeight();
// start getting the pixels
Raster pixelData;
pixelData = image.getData();
return pixelData.getPixels(0, 0, wid, hgt, dummy);
}
@SuppressWarnings("empty-statement")
public void write(int width ,int height, int[] pixels,File outputfile) {
try {
// retrieve image
BufferedImage writeImage = new BufferedImage(512, 512, BufferedImage.TYPE_BYTE_GRAY);;
// File outputfile = new File("encrypted.png");
WritableRaster raster = (WritableRaster) writeImage.getData();
raster.setPixels(0,0,width,height,pixels);
writeImage.setData(raster);
ImageIO.write(writeImage, "png", outputfile);
} catch (IOException e) {
}
}
}
It's not clear which of the below is triggering your error. This
File EnImage = new File("encrypted.png");
resolves against the current working directory, which is not necessarily the same directory as the one your jar file is in.
This
image = ImageIO.read(getClass().getResourceAsStream("lena.png"));
will read from the directory in the jar file that your class is in. Note that you're reading from the jar file, not the directory.
Given the above code, I would:
- determine or explicitly specify the working directory for the File() operations; the working directory is the one you invoke java from, and it may differ inside and outside the IDE (see the sketch after this list)
- package lena.png as a resource within your .jar file.
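If the encrypted.png and decrypted.png outputs should always land next to the jar rather than in whatever the working directory happens to be, one option (my own sketch, not part of the answer above) is to resolve the code location explicitly:

// Directory containing the running jar (or, inside the IDE, the parent of the
// classes directory) - resolved from the code source rather than "user.dir".
File jarDir = new File(Test.class.getProtectionDomain()
        .getCodeSource().getLocation().toURI()).getParentFile();
File enImage = new File(jarDir, "encrypted.png");
File deImage = new File(jarDir, "decrypted.png");

getLocation().toURI() can throw URISyntaxException, which the existing main already permits via its throws Exception clause.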