public static void main(String[] args) {
    String finalHex = "";
    String input = "Hello There Sir.";
    int pixelX = -1;
    int pixelY = 0;
    try {
        BufferedImage bi = new BufferedImage(64, 64, BufferedImage.TYPE_INT_ARGB);
        File out = new File("saved.png");
        if (out.exists() == false) {
            ImageIO.write(bi, "png", out);
            System.out.println("PNG WAS CREATED");
        } else {
            System.out.println("ERROR: PNG WAS ALREADY THERE");
        }
        for (int i = 0; i < input.length(); i++) {
            char result = input.charAt(i);
            int ascii = (int) result;
            String num = Integer.toHexString(ascii).toUpperCase();
            if (finalHex.length() == 6) {
                System.out.println(finalHex);
                pixelX += 1;
                finalHex = ("#" + finalHex);
                Color c = Color.decode(finalHex);
                int rgb = c.getRGB();
                System.out.println(rgb);
                if (pixelX == 63) {
                    pixelX = 0;
                    pixelY += 1;
                }
                bi.setRGB(pixelX, pixelY, rgb);
                finalHex = "";
            }
            finalHex += num;
        }
    } catch (IOException e) {
        System.out.println("ERROR: WELP... SOMETHING SCREWED UP.");
    }
}
I am trying to use this to convert text into a PNG image, but I can't get it to write to a PNG file. I am not very experienced in this area, so if anyone could help me out I would very much appreciate it. :)
You should add ImageIO.write(bi, "png", out); after the end of the for (int i = 0; i < input.length(); i++) { ... } loop. This program will write some colored pixels; is that what you want?
Example: result picture
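A minimal sketch of that reordering, reusing the question's variable names (the loop body is elided):

// Fill the pixels first, then write the finished image once, after the loop.
for (int i = 0; i < input.length(); i++) {
    // ... existing code that builds finalHex and calls bi.setRGB(...) ...
}
ImageIO.write(bi, "png", out); // saved.png now contains the colored pixels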
I only have the image data, with NO header information, but I know several things:
16-bit grayscale data
1200x1200 (although it is really 1200x900; it likely has a "bar" at the bottom)
the data is 2,880,000 bytes in size, which fits 1200 x 1200 x 2 bytes -> short
here is the raw image data
For visualizing I use this:
public static void saveImage(short[] pix, int width, int height, File outputfile) {
    ColorSpace cs = ColorSpace.getInstance(ColorSpace.CS_GRAY);
    int[] nBits = {16};
    ComponentColorModel cm = new ComponentColorModel(cs, nBits, false, false, Transparency.OPAQUE, DataBuffer.TYPE_USHORT);
    SampleModel sm = cm.createCompatibleSampleModel(width, height);
    DataBufferShort db = new DataBufferShort(pix, width * height);
    WritableRaster raster = Raster.createWritableRaster(sm, db, null);
    BufferedImage bf = new BufferedImage(cm, raster, false, null);
    if (outputfile != null) {
        try {
            if (!ImageIO.write(bf, "png", outputfile)) System.out.println("No writer found.");
            System.out.println("Saved: " + outputfile.getAbsolutePath());
        } catch (IOException e) {
            e.printStackTrace();
        }
    } else {
        System.out.println("error");
    }
}
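The buffer and len used in the reading code below are not shown here; assume roughly something like this (the file name is only a placeholder, and IOException handling is omitted):

import java.nio.ByteBuffer;
import java.nio.file.Files;
import java.nio.file.Paths;

// Load the raw dump into memory and wrap it in a ByteBuffer.
byte[] bytes = Files.readAllBytes(Paths.get("raw_image.dat")); // placeholder name
ByteBuffer buffer = ByteBuffer.wrap(bytes);
int len = bytes.length; // expected to be 2880000 (1200 * 1200 * 2)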
The data is read like this (only experimental/bad code, it is just for testing):
for (int tt = 1; tt < 20; tt++) {
    pix = new short[1200 * 1200];
    i = 0;
    int z = 0;
    int line = 0;
    // loop
    while (i < (1200 * 1200)) {
        pix[i++] = buffer.getShort(z);
        z += tt;
        if (z >= (len - 1)) {
            line += 2;
            z = line;
            if (z >= (len - 1)) {
                System.out.println("break at " + z);
                break;
            }
            System.out.println("test " + line);
        }
    }
    System.out.println("img_" + imgcount + ".png " + pix.length);
    saveImage(pix, 1200, 1200, new File("img_" + imgcount + "_" + tt + ".png"));
}
With this I can see something for tt=4, 8, 16 (the image gets multiplied), but I can't really get the whole picture. image tt=8 image tt=16
It's like the solution is right in front of me but I can't see it. xD
Can someone help me with the algorithm/format in which the image is stored?
EDIT: Reading data consecutively with:
short[] pix = new short[1200 * 1200];
int i = 0;
while (i < (1200 * 1200) && buffer.remaining() > 0) {
    pix[i++] = buffer.getShort();
}
results in: noisy picture
EDIT 2:
OK, it looks like it is base64 encoded, which makes sense since it is stored in an XML file.
I finally solved it: it is base64 encoded and little endian (thanks RealSkeptic for hinting to try little/big endian).
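For completeness, a minimal sketch of that decoding, assuming the base64 string has already been pulled out of the XML (the variable and file names are placeholders):

import java.io.File;
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.util.Base64;

// base64Pixels is assumed to hold the text content of the XML element with the image data.
byte[] raw = Base64.getDecoder().decode(base64Pixels);
ByteBuffer buf = ByteBuffer.wrap(raw).order(ByteOrder.LITTLE_ENDIAN);

short[] pix = new short[1200 * 1200];
int i = 0;
while (i < pix.length && buf.remaining() >= 2) {
    pix[i++] = buf.getShort(); // little-endian 16-bit grayscale samples
}
saveImage(pix, 1200, 1200, new File("decoded.png")); // uses the saveImage method above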
I built a little Java program that hides messages in an image using the least significant bit method. It works fine when inputting a JPG file. The output may be PNG or JPG. When inputting a PNG, though, the result looks very strange.
Here are the original and the result images respectively:
public abstract class Builder {
    public static void leastSignificantBitEncryption(String imageSource, String message, String newPath) {
        BufferedImage image = returnImage(imageSource);
        // prepare variables
        String[] messageBinString = null;
        String[] pixelBinString = null;
        final byte[] messageBin = message.getBytes(StandardCharsets.UTF_8);
        final byte[] pixelsBin = ((DataBufferByte) image.getRaster().getDataBuffer()).getData();
        // convert message and image to binary string arrays
        try {
            messageBinString = stringToBinaryStrings(messageBin);
            pixelBinString = stringToBinaryStrings(pixelsBin);
        } catch (UnsupportedEncodingException e) {
            e.printStackTrace();
        }
        String[] messageBinStringCut = splitIn2Bit(messageBinString); // split message binary into 2-bit strings
        String[] pixelBinStringNew = pixelBinString.clone();
        insert2Bit(messageBinStringCut, pixelBinStringNew); // insert 2-bit strings into the last 2 bits of the bitmap bytes
        byte[] pixelsBinNew = stringArrayToByteArray(pixelBinStringNew); // convert string array to byte array
        try { // create new image out of bitmap
            int w = image.getWidth();
            int h = image.getHeight();
            BufferedImage imageNew = new BufferedImage(w, h, BufferedImage.TYPE_3BYTE_BGR);
            imageNew.setData(Raster.createRaster(imageNew.getSampleModel(), new DataBufferByte(pixelsBinNew, pixelsBinNew.length), new Point()));
            File imageFile = new File(newPath);
            ImageIO.write(imageNew, "png", imageFile);
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    private static String[] stringToBinaryStrings(byte[] messageBin) throws UnsupportedEncodingException {
        String[] bytes = new String[messageBin.length];
        int i = 0;
        for (byte b : messageBin) {
            bytes[i] = String.format("%8s", Integer.toBinaryString(b & 0xFF)).replace(' ', '0');
            i++;
        }
        return bytes;
    }

    private static String binaryStringsToString(String[] messageBin) throws UnsupportedEncodingException {
        StringBuilder stringBuilder = new StringBuilder();
        int i = 0;
        while (messageBin[i] != null) {
            stringBuilder.append((char) Integer.parseInt(messageBin[i], 2));
            i++;
        }
        return stringBuilder.toString();
    }

    private static BufferedImage returnImage(String imageSource) {
        try {
            try {
                return ImageIO.read(new URL(imageSource));
            } catch (MalformedURLException e) {
                return ImageIO.read(new File(imageSource));
            }
        } catch (IOException ioe) {
            ioe.printStackTrace();
            return null;
        }
    }

    private static byte[] stringArrayToByteArray(String[] stringArray) {
        byte[] byteArray = new byte[stringArray.length];
        for (int i = 0; i < stringArray.length; i++) {
            byteArray[i] = (byte) Integer.parseInt(stringArray[i], 2);
        }
        return byteArray;
    }

    private static String[] splitIn2Bit(String[] inputArray) {
        String[] outputArray = new String[inputArray.length * 4];
        for (int i = 0; i < outputArray.length; i += 4) {
            String[] splitByte = inputArray[i / 4].split("(?<=\\G..)");
            outputArray[i] = splitByte[0];
            outputArray[i + 1] = splitByte[1];
            outputArray[i + 2] = splitByte[2];
            outputArray[i + 3] = splitByte[3];
        }
        return outputArray;
    }

    private static String[] insert2Bit(String[] twoBitArray, String[] insertArray) {
        for (int i = 0; i < twoBitArray.length; i++) {
            insertArray[i] = insertArray[i].substring(0, 6) + twoBitArray[i];
        }
        return insertArray;
    }
}
Also, the test class:
public class Test {
    public static void main(String[] args) {
        Builder.leastSignificantBitEncryption("IMAGEPATH OR URL", "MESSAGE", "PATH FOR IMAGE CONTAINING MESSAGE");
        Builder.leastSignificantBitDecryption("PATH OF IMAGE CONTAINING MESSAGE", "PATH FOR TXT CONTAINING OUTPUT");
    }
}
The error originates from the fact that the PNG image has an extra channel for transparency. System.out.println(pixelsBin.length); returns 338355 bytes for the JPG and 451140 bytes for the PNG (451140 / 4 = 338355 / 3 = 112785 pixels, so the PNG carries 4 bytes per pixel while the JPG carries 3).
The simplest solution would be to create the appropriate imageNew depending on the file format. For example:
int w = image.getWidth();
int h = image.getHeight();
BufferedImage imageNew = null;
if (imageSource.matches(".*jpg$")) {
    imageNew = new BufferedImage(w, h, BufferedImage.TYPE_3BYTE_BGR);
} else if (imageSource.matches(".*png$")) {
    imageNew = new BufferedImage(w, h, BufferedImage.TYPE_4BYTE_ABGR);
} else {
    // whatever
}
imageNew.setData(Raster.createRaster(imageNew.getSampleModel(), new DataBufferByte(pixelsBinNew, pixelsBinNew.length), new Point()));
However, you have to be aware that the message is then not embedded in the same pixels for both types. The byte array of a 3-channel image (no transparency) goes like this:
first-pixel-BLUE, first-pixel-GREEN, first-pixel-RED, second-pixel-BLUE, etc.
while for a 4-channel image it goes:
first-pixel-ALPHA, first-pixel-BLUE, first-pixel-GREEN, first-pixel-RED, second-pixel-ALPHA, etc.
If you care about that detail, you might be interested in removing the alpha channel from the png first, so you're always working with 3-channel images.
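If you go that route, here is a minimal sketch of stripping the alpha channel by redrawing the image into a 3-channel buffer (the helper name is just an example, not part of the original code):

// Flattens any input to TYPE_3BYTE_BGR so the embedding always sees the
// BLUE, GREEN, RED byte layout described above.
private static BufferedImage dropAlpha(BufferedImage src) {
    BufferedImage rgb = new BufferedImage(src.getWidth(), src.getHeight(), BufferedImage.TYPE_3BYTE_BGR);
    Graphics2D g = rgb.createGraphics();
    g.drawImage(src, 0, 0, null); // alpha is discarded during the draw
    g.dispose();
    return rgb;
}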
I was using OpenCV to apply filters and ran into a problem.
What I did in this specific code was first try to get a grayscale image while preserving the color channels. It works fine, but when I write to a file using ImageIO I find the alpha has been altered. So I checked for BGRA and ABGR color spaces, but it still does not work and gives me a transparent image.
public static BufferedImage sepia(BufferedImage image, int intensity) {
    Mat imageMat = bufferedToMat(image);
    int sepiaDepth = 20;
    int width = image.getWidth();
    int height = image.getHeight();
    Mat grayScaleMat = new Mat(imageMat.height(), imageMat.width(), CvType.CV_8UC4);
    imageMat.copyTo(grayScaleMat);
    // double[] test = imageMat.get(0, 0);
    // System.out.println(test[0]+" "+test[1]+" "+test[2]+" "+test[3]);
    for (int i = 0; i < grayScaleMat.cols(); i++) {
        for (int j = 0; j < grayScaleMat.rows(); j++) {
            // can be optimised
            double[] data = grayScaleMat.get(j, i);
            // System.out.println(data.length);
            double blue = data[0];
            double green = data[1];
            double red = data[2];
            // System.out.println(red+" "+blue+" "+green);
            double gray = (red + blue + green) / 3.0;
            data[0] = gray;
            data[1] = gray;
            data[2] = gray;
            grayScaleMat.put(j, i, data);
        }
    }
    return (Utility.matToBuffered(grayScaleMat));
}
// Only testing, remove later
public static void main(String[] args) {
    System.loadLibrary(Core.NATIVE_LIBRARY_NAME);
    int beta = 25;
    String imagePath = "/home/arjun/Pictures/Lenna.png";
    BufferedImage image = null;
    try {
        image = ImageIO.read(new File(imagePath));
    } catch (IOException e) {
        e.printStackTrace();
    }
    int x = image.getType();
    System.out.println(x);
    BufferedImage output = sepia(image, beta);
    int y = output.getType();
    System.out.println(y);
    File outputfile = new File("/home/arjun/Pictures/sepia2.png");
    try {
        ImageIO.write(output, "png", outputfile);
    } catch (IOException e) {
        e.printStackTrace();
    }
}
And here are the BufferedImage and Mat conversions:
public static Mat bufferedToMat(BufferedImage image) {
    byte[] pixels = ((DataBufferByte) image.getRaster().getDataBuffer()).getData();
    Mat imageMat = new Mat(image.getHeight(), image.getWidth(), CvType.CV_8UC4);
    imageMat.put(0, 0, pixels);
    return imageMat;
}

public static BufferedImage matToBuffered(Mat imageMat) {
    BufferedImage out;
    byte[] data = new byte[imageMat.cols() * imageMat.rows() * (int) imageMat.elemSize()];
    imageMat.get(0, 0, data);
    int type = BufferedImage.TYPE_3BYTE_BGR;
    if (imageMat.channels() == 1) {
        type = BufferedImage.TYPE_BYTE_GRAY;
    } else if (imageMat.channels() == 3) {
        type = BufferedImage.TYPE_3BYTE_BGR;
    } else if (imageMat.channels() == 4) {
        type = BufferedImage.TYPE_4BYTE_ABGR;
    }
    out = new BufferedImage(imageMat.cols(), imageMat.rows(), type);
    out.getRaster().setDataElements(0, 0, imageMat.cols(), imageMat.rows(), data);
    return out;
}
Input Image:
Output Image:
import javax.imageio.ImageIO;
import org.bytedeco.javacv.FFmpegFrameGrabber;

public class FrameData {
    int count = 0;
    int picWidth;
    int picHeight;
    BufferedImage img = null;

    // GET FRAME COUNT
    public int gf_count(int numofFrames, BufferedImage[] frameArray, String fileLocationsent, String videoNamesent) throws IOException {
        String fileLocation = fileLocationsent;
        String videoName = videoNamesent;
        int frameNums = numofFrames;
        int totFrames = 0;
        FFmpegFrameGrabber grab = new FFmpegFrameGrabber(fileLocation + videoName);
        try { grab.start(); }
        catch (Exception e) { System.out.println("Unable to grab frames"); }
        for (int i = 0; i < frameNums; i++) {
            try {
                frameArray[i] = grab.grab().getBufferedImage();
                totFrames = i;
                File outputfile = new File(fileLocation + "GrayScaledImage" + i + ".jpg");
                ImageIO.write(frameArray[i], "jpg", outputfile);
            } catch (Exception e) { /*e.printStackTrace();*/ }
        } // END for
        return totFrames;
    } // END METHOD long getFrameCount()
Hope someone can explain this to me...
I am just learning Java, so here goes...
I wrote this code to count the number of frames in a .mov file, and to test my buffered image array I generated files of the images. As the code is, it works as planned. The problem is that if, immediately after the capturing, I write the BufferedImages out as files, they all seem to be just the first image. See the example below...
for (int i = 0; i < frameNums; i++) {
    try {
        frameArray[i] = grab.grab().getBufferedImage();
        totFrames = i;
        File outputfile = new File(fileLocation + "GrayScaledImage" + i + ".jpg");
        ImageIO.write(frameArray[i], "jpg", outputfile);
    } catch (Exception e) { /*e.printStackTrace();*/ }
} // END for
And now if I change that to...
for (int i = 0; i < frameNums; i++) {
    try {
        frameArray[i] = grab.grab().getBufferedImage();
        totFrames = i;
    } catch (Exception e) { /*e.printStackTrace();*/ }
}
for (int j = 0; j < frameNums; j++) {
    File outputfile = new File(fileLocation + "GrayScaledImage" + j + ".jpg");
    ImageIO.write(frameArray[j], "jpg", outputfile);
}
I don't understand why I am getting the same image repeatedly.
If further information is required, just let me know; this is my first programming question online... I usually find what I am looking for in questions others have asked, but I couldn't find this one.
Thanks for your time
Ken
The problem is that the grab().getBufferedImage() does its work in the same buffer every time. When you assign a reference to that buffer in your loop, you are assigning a reference to the same buffer numofFrames times. What you are writing then is not the first frame, but the last frame. In order to fix this you need to do a "deep copy" of the BufferedImage. See code below:
public class FrameData {
    BufferedImage img;
    Graphics2D g2;

    // GET FRAME COUNT
    public int gf_count(int numFrames, BufferedImage[] frameArray, String fileLocation, String videoName) throws Exception, IOException {
        Java2DFrameConverter converter = new Java2DFrameConverter();
        int totFrames = 0;
        img = new BufferedImage(100, 50, BufferedImage.TYPE_INT_ARGB);
        g2 = img.createGraphics();
        FFmpegFrameGrabber grab = new FFmpegFrameGrabber(fileLocation + videoName);
        grab.start();
        for (int i = 0; i < numFrames; i++) {
            frameArray[i] = deepCopy(converter.convert(grab.grab()));
            totFrames = i;
        }
        for (int j = 0; j < totFrames; j++) {
            File outputfile = new File(fileLocation + "TImage" + j + ".jpg");
            ImageIO.write(frameArray[j], "jpg", outputfile);
        }
        return totFrames;
    } // END METHOD long getFrameCount()

    BufferedImage deepCopy(BufferedImage bi) {
        ColorModel cm = bi.getColorModel();
        boolean isAlphaPremultiplied = cm.isAlphaPremultiplied();
        WritableRaster raster = bi.copyData(null);
        return new BufferedImage(cm, raster, isAlphaPremultiplied, null);
    }

    // This does what the converter.convert seems to do, which is decode an
    // image into the same place over and over. If you don't copy the result
    // every time, then you end up with an array of references to the same
    // last frame.
    BufferedImage draw() {
        g2.setColor(new Color(-1));
        g2.fillRect(0, 0, 100, 50);
        g2.setColor(new Color(0));
        g2.drawLine(0, 0, (int) (Math.random() * 100.0), (int) (Math.random() * 50.0));
        return img;
    }

    public static void main(String... args) throws Exception, IOException {
        new FrameData().run();
    }

    private void run() throws Exception, IOException {
        BufferedImage images[] = new BufferedImage[50];
        gf_count(50, images, "C:/Users/karl/Videos/", "dance.mpg");
    }
}
I have included a draw() method that shows by example how work is done in the same BufferedImage repeatedly, in case you want to replicate the problem.
There are certainly other ways to do a deep copy and there may be issues with the one shown. Reference: How do you clone a BufferedImage.
PS: I updated the code to use the 1.1 version of the bytedeco library.
I am trying to loop through a folder of PNG images and stitch them together to form one image. I have gotten it to work without a for loop, just combining two images from the folder, so I am assuming it is the looping that isn't working. The images are located in a folder called "images".
I would appreciate your input!
File file = new File("images");
File [] moreFile = file.listFiles();
String [] images = new String [moreFile.length];
for(int i =0; i <images.length; i++)
{images[i] = moreFile[i].getName();
}
int x = 0;
int y = 0;
BufferedImage result = new BufferedImage(
controller.getSize().width*2, controller.getSize().height, //work these out
BufferedImage.TYPE_INT_RGB);
Graphics g = result.getGraphics();
for(String image : images){
System.out.println(image);
File path = new File("images");
try{ImageIO.read(new File(path, image));
System.out.println("it definitely goes here");
BufferedImage bi = new BufferedImage((2*510),(2*439),BufferedImage.TYPE_INT_ARGB);
bi =ImageIO.read(new File(path, image));
g.drawImage(bi, x, y, null);
g.dispose();
/// System.out.println(dafuck);
x += bi.getWidth();
if(x > result.getWidth()){
x = 0;
y += bi.getHeight();
}
}catch (Exception e) {
System.out.println("is it an exception>?");
}
}
try {
ImageIO.write(result,"png",new File("results"));
} catch (IOException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}