I am getting an int array of pixels from a PNG image. How can I convert this to a BufferedImage or create a new PNG file?
int[] pixel = new int[w1*h1];
int i = 0;
for (int xx = 0; xx < h1; xx++) {
for (int yy = 0; yy < w1; yy++) {
pixel[i] = img.getRGB(yy, xx);
i++;
}
}
If you have an array of integers that are packed RGB values, this is the Java code to save it to a file:
int width = 100;
int height = 100;
int[] rgbs = buildRaster(width, height);
DataBuffer rgbData = new DataBufferInt(rgbs, rgbs.length);
WritableRaster raster = Raster.createPackedRaster(rgbData, width, height, width,
new int[]{0xff0000, 0xff00, 0xff},
null);
ColorModel colorModel = new DirectColorModel(24, 0xff0000, 0xff00, 0xff);
BufferedImage img = new BufferedImage(colorModel, raster, false, null);
String fname = "/tmp/whatI.png";
ImageIO.write(img, "png", new File(fname));
System.out.println("wrote to "+fname);
The reason for the mask array 0xff0000, 0xff00, 0xff is that the RGB bytes are packed with blue in the least significant byte. If you pack your ints differently, alter that array.
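For example, a minimal sketch assuming a hypothetical packing with red in the least significant byte (0x00BBGGRR), reusing rgbData, width and height from the snippet above, would simply reverse the masks:
// Hypothetical packing 0x00BBGGRR: the red mask becomes 0xff, blue becomes 0xff0000
WritableRaster raster = Raster.createPackedRaster(rgbData, width, height, width,
        new int[]{0xff, 0xff00, 0xff0000},
        null);
ColorModel colorModel = new DirectColorModel(24, 0xff, 0xff00, 0xff0000);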
You can rebuild the image manually; however, this is a pretty expensive operation.
BufferedImage image = new BufferedImage(64, 64, BufferedImage.TYPE_INT_RGB);
Graphics g = image.getGraphics();
for(int i = 0; i < pixels.size(); i++)
{
g.setColor(new java.awt.Color(pixels.get(i).getRed(), pixels.get(i).getGreen(), pixels.get(i).getBlue()));
g.fillRect(pixels.get(i).getxPos(), pixels.get(i).getyPos(), 1, 1);
}
try
{
ImageIO.write(image, "PNG", new File("imageName.png"));
}
catch(IOException error)
{
error.printStackTrace();
}
I formatted your image array into an object; this is personal preference, though (of course you could use an int array with this model as well). Keep in mind that you can always add the alpha channel there as well.
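If your array simply holds the packed ints returned by getRGB (as in the question's loop), a shorter sketch is to write them back in bulk; this assumes the w1, h1 and pixel names from the question and the usual java.awt.image/javax.imageio imports:
// The question's loop fills the array row by row, so the scan size is w1
BufferedImage out = new BufferedImage(w1, h1, BufferedImage.TYPE_INT_RGB);
out.setRGB(0, 0, w1, h1, pixel, 0, w1);
ImageIO.write(out, "png", new File("rebuilt.png"));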
Try the ImageIO class, which can read a byte array containing encoded image data (for example an existing PNG or JPEG) into an image object and then write it back out in a particular format.
try {
BufferedImage bufferedImage = ImageIO.read(new ByteArrayInputStream(yourBytes));
ImageIO.write(bufferedImage, "png", new File("out.png"));
} catch (IOException e) {
e.printStackTrace();
}
I need to display a PNG image in an SWT Java window; I'm using WindowBuilder and Eclipse.
First I tried with a label and this code:
Label lblNewLabel = new Label(this, SWT.NONE);
lblNewLabel.setLayoutData(new GridData(SWT.CENTER, SWT.CENTER, true, true, 1, 1));
Image image = new Image(display, "img/selae_mini.png");
lblNewLabel.setImage(image);
It worked when executing in Eclipse, but it stopped working once I generated the jar. Then I found on Stack Overflow that you must use ClassLoader.getSystemClassLoader().getResourceAsStream to get a BufferedImage, after that convert that BufferedImage to an ImageData, and finally convert that into an SWT Image.
So I tried with this code:
protected Image readImage(String path, Display display) {
InputStream stream = ClassLoader.getSystemClassLoader().getResourceAsStream(path);
BufferedImage bi = null;
try {
bi = ImageIO.read(stream);
} catch (IOException e) {
e.printStackTrace();
} finally {
try {
if (stream != null)
stream.close();
} catch (IOException e) {
e.printStackTrace();
}
}
return new Image(display, convertToSWT(bi));
}
public static ImageData convertToSWT(BufferedImage bufferedImage) {
if (bufferedImage.getColorModel() instanceof DirectColorModel) {
DirectColorModel colorModel = (DirectColorModel) bufferedImage.getColorModel();
PaletteData palette = new PaletteData(
colorModel.getRedMask(),
colorModel.getGreenMask(),
colorModel.getBlueMask()
);
ImageData data = new ImageData(
bufferedImage.getWidth(),
bufferedImage.getHeight(), colorModel.getPixelSize(),
palette
);
WritableRaster raster = bufferedImage.getRaster();
int[] pixelArray = new int[3];
for (int y = 0; y < data.height; y++) {
for (int x = 0; x < data.width; x++) {
raster.getPixel(x, y, pixelArray);
int pixel = palette.getPixel(
new RGB(pixelArray[0], pixelArray[1], pixelArray[2])
);
data.setPixel(x, y, pixel);
}
}
return data;
} else if (bufferedImage.getColorModel() instanceof IndexColorModel) {
IndexColorModel colorModel = (IndexColorModel) bufferedImage.getColorModel();
int size = colorModel.getMapSize();
byte[] reds = new byte[size];
byte[] greens = new byte[size];
byte[] blues = new byte[size];
colorModel.getReds(reds);
colorModel.getGreens(greens);
colorModel.getBlues(blues);
RGB[] rgbs = new RGB[size];
for (int i = 0; i < rgbs.length; i++) {
rgbs[i] = new RGB(reds[i] & 0xFF, greens[i] & 0xFF, blues[i] & 0xFF);
}
PaletteData palette = new PaletteData(rgbs);
ImageData data = new ImageData(
bufferedImage.getWidth(),
bufferedImage.getHeight(),
colorModel.getPixelSize(),
palette
);
data.transparentPixel = colorModel.getTransparentPixel();
WritableRaster raster = bufferedImage.getRaster();
int[] pixelArray = new int[1];
for (int y = 0; y < data.height; y++) {
for (int x = 0; x < data.width; x++) {
raster.getPixel(x, y, pixelArray);
data.setPixel(x, y, pixelArray[0]);
}
}
return data;
}
return null;
}
The problem now is that my image is a PNG file, and when the convertToSWT method runs its checks, the image's ColorModel is neither a DirectColorModel nor an IndexColorModel (its toString only shows #pixelBits), so the method returns null, and I can't find any info about how to solve this problem.
I had this problem before, and I resolved it. Assuming your image is located in the resource folder of your project:
String yourImg = "sampleImg.png";
...
Label swtImg = new Label(composite, SWT.NONE);
swtImg.setImage(new Image(display, YourClassName.class.getResourceAsStream(yourImg)));
It worked for me!
Good luck ;)
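If you still need the BufferedImage-to-ImageData route (for example to post-process pixels), one possible workaround, sketched here as an assumption rather than a tested fix, is to redraw the loaded PNG into a TYPE_INT_RGB image first; that image is backed by a DirectColorModel, so the existing branch of convertToSWT applies (note this drops any alpha channel):
// bi is the BufferedImage read via ImageIO, as in readImage above
BufferedImage rgbImage = new BufferedImage(bi.getWidth(), bi.getHeight(), BufferedImage.TYPE_INT_RGB);
Graphics2D g2 = rgbImage.createGraphics();
try {
    g2.drawImage(bi, 0, 0, null); // repaint onto a DirectColorModel-backed image
} finally {
    g2.dispose();
}
return new Image(display, convertToSWT(rgbImage));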
I'm trying to save my bitmap as a PNG file to internal storage.
How can I do it?
My code to generate the QR code:
public void createQRCode(String qrCodeData, String charset, Map hintMap, int qrCodeheight, int qrCodewidth) {
Bitmap bitmap = null;
try {
//generating qr code in bitmatrix type
BitMatrix matrix = new MultiFormatWriter().encode(new String(qrCodeData.getBytes(charset), charset), BarcodeFormat.QR_CODE, qrCodewidth, qrCodeheight, hintMap);
//converting bitmatrix to bitmap
int width = matrix.getWidth();
int height = matrix.getHeight();
int[] pixels = new int[width * height];
// All are 0, or black, by default
for (int y = 0; y < height; y++) {
int offset = y * width;
for (int x = 0; x < width; x++) {
pixels[offset + x] = matrix.get(x, y) ? BLACK : WHITE;
}
}
bitmap = Bitmap.createBitmap(width, height, Bitmap.Config.ARGB_8888);
bitmap.setPixels(pixels, 0, width, 0, 0, width, height);
qrImageView.setImageBitmap(bitmap);
} catch (Exception er) {
Log.e("QrGenerate", er.getMessage());
}
}
I want to save the generated QR code bitmap to internal storage; for example, in the DCIM folder I want to make a new folder "QR CODES" and then save my bitmap as "QRCode1.png".
How can I do this?
I have tried many tutorials and none of them seem to work for me, or maybe I am doing it wrong.
PS
I have already added the write permission to my AndroidManifest file.
Try doing this and then use the resulting bytes to save the image on your device.
void saveImageBitmap(Bitmap bitmap) {
ByteArrayOutputStream baos = new ByteArrayOutputStream();
bitmap.compress(Bitmap.CompressFormat.JPEG, 20, baos);
byte[] data = baos.toByteArray();
//do whatever you want to save this bitmap file
}
Hope that helps! This compresses the bitmap into .jpg format; you can then write the resulting bytes to a file.
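Since the question asks specifically for a PNG under DCIM/QR CODES, here is a minimal sketch under the assumption that legacy external storage is available and WRITE_EXTERNAL_STORAGE has been granted; the folder and file names are just the ones from the question:
// Save the generated QR bitmap as a PNG under DCIM/QR CODES
File dir = new File(Environment.getExternalStoragePublicDirectory(Environment.DIRECTORY_DCIM), "QR CODES");
if (!dir.exists()) {
    dir.mkdirs();
}
File outFile = new File(dir, "QRCode1.png");
try (FileOutputStream out = new FileOutputStream(outFile)) {
    bitmap.compress(Bitmap.CompressFormat.PNG, 100, out); // PNG is lossless, the quality value is ignored
} catch (IOException e) {
    Log.e("QrGenerate", "Failed to save PNG", e);
}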
I've got a byte array storing 16-bit pixel data from an already-deconstructed DICOM file. What I need to do now is convert/export that pixel data somehow into a TIFF file format. I'm using the imageio-tiff-3.3.2.jar plugin to handle the tiff conversion/header data. But now I need to pack that image data array into a BufferedImage of the original image dimensions so it can be exported to TIFF. But it seems that BufferedImage doesn't support 16-bit images. Is there a way around this problem, such as an external library? Is there another way I can pack that image data into a TIFF image of the original DICOM dimensions? Keep in mind, this process has to be completely lossless. I've looked around and tried out some things for the last few days, but so far nothing has worked for me.
Let me know if you have any questions or if there's anything I can do to clear up any confusion.
EDIT: Intended and Current image
Given your input data of a raw byte array containing unsigned 16-bit image data, here are two ways to create a BufferedImage.
The first one will be slower, as it involves copying the byte array into a short array. It will also need twice the amount of memory. The upside is that it creates a standard TYPE_USHORT_GRAY BufferedImage, which may be faster to display and may be more compatible.
private static BufferedImage createCopyUsingByteBuffer(int w, int h, byte[] rawBytes) {
short[] rawShorts = new short[rawBytes.length / 2];
ByteBuffer.wrap(rawBytes)
// .order(ByteOrder.LITTLE_ENDIAN) // Depending on the data's endianness
.asShortBuffer()
.get(rawShorts);
DataBuffer dataBuffer = new DataBufferUShort(rawShorts, rawShorts.length);
int stride = 1;
WritableRaster raster = Raster.createInterleavedRaster(dataBuffer, w, h, w * stride, stride, new int[] {0}, null);
ColorModel colorModel = new ComponentColorModel(ColorSpace.getInstance(ColorSpace.CS_GRAY), false, false, Transparency.OPAQUE, DataBuffer.TYPE_USHORT);
return new BufferedImage(colorModel, raster, colorModel.isAlphaPremultiplied(), null);
}
A variant that is much faster to create (the previous version takes 4-5x more time), but results in a TYPE_CUSTOM image that might be slower to display (it does seem to perform reasonably in my tests, though). It uses very little extra memory, as it does no copying/conversion of the input data at creation time.
Instead, it uses a custom sample model, that has DataBuffer.TYPE_USHORT as transfer type, but uses DataBufferByte as data buffer.
private static BufferedImage createNoCopy(int w, int h, byte[] rawBytes) {
DataBuffer dataBuffer = new DataBufferByte(rawBytes, rawBytes.length);
int stride = 2;
SampleModel sampleModel = new MyComponentSampleModel(w, h, stride);
WritableRaster raster = Raster.createWritableRaster(sampleModel, dataBuffer, null);
ColorModel colorModel = new ComponentColorModel(ColorSpace.getInstance(ColorSpace.CS_GRAY), false, false, Transparency.OPAQUE, DataBuffer.TYPE_USHORT);
return new BufferedImage(colorModel, raster, colorModel.isAlphaPremultiplied(), null);
}
private static class MyComponentSampleModel extends ComponentSampleModel {
public MyComponentSampleModel(int w, int h, int stride) {
super(DataBuffer.TYPE_USHORT, w, h, stride, w * stride, new int[] {0});
}
@Override
public Object getDataElements(int x, int y, Object obj, DataBuffer data) {
if ((x < 0) || (y < 0) || (x >= width) || (y >= height)) {
throw new ArrayIndexOutOfBoundsException("Coordinate out of bounds!");
}
// Simplified, as we only support TYPE_USHORT
int numDataElems = getNumDataElements();
int pixelOffset = y * scanlineStride + x * pixelStride;
short[] sdata;
if (obj == null) {
sdata = new short[numDataElems];
}
else {
sdata = (short[]) obj;
}
for (int i = 0; i < numDataElems; i++) {
sdata[i] = (short) (data.getElem(0, pixelOffset) << 8 | data.getElem(0, pixelOffset + 1));
// If little endian, swap the element order, like this:
// sdata[i] = (short) (data.getElem(0, pixelOffset + 1) << 8 | data.getElem(0, pixelOffset));
}
return sdata;
}
}
If your image looks strange after this conversion, try flipping the endianness, as commented in the code.
And finally, some code to exercise the above:
public static void main(String[] args) {
int w = 1760;
int h = 2140;
byte[] rawBytes = new byte[w * h * 2]; // This will be your input array, 7532800 bytes
ShortBuffer buffer = ByteBuffer.wrap(rawBytes)
// .order(ByteOrder.LITTLE_ENDIAN) // Try swapping the byte order to see sharp edges
.asShortBuffer();
// Let's make a simple gradient, from black UL to white BR
int max = 65535; // Unsigned short max value
for (int y = 0; y < h; y++) {
double v = max * y / (double) h;
for (int x = 0; x < w; x++) {
buffer.put((short) Math.round((v + max * x / (double) w) / 2.0));
}
}
final BufferedImage image = createNoCopy(w, h, rawBytes);
// final BufferedImage image = createCopyUsingByteBuffer(w, h, rawBytes);
SwingUtilities.invokeLater(new Runnable() {
@Override
public void run() {
JFrame frame = new JFrame("Test");
frame.setDefaultCloseOperation(WindowConstants.EXIT_ON_CLOSE);
frame.add(new JScrollPane(new JLabel(new ImageIcon(image))));
frame.pack();
frame.setLocationRelativeTo(null);
frame.setVisible(true);
}
});
}
Here's what the output should look like (scaled down to 1/10th):
The easiest thing to do is to create a BufferedImage of type TYPE_USHORT_GRAY, which is the type to use for 16-bit encoding.
public BufferedImage Convert(short[] array, final int width, final int height)
{
BufferedImage image = new BufferedImage(width, height, BufferedImage.TYPE_USHORT_GRAY);
short[] sb = ((DataBufferUShort) image.getRaster().getDataBuffer()).getData();
System.arraycopy(array, 0, sb, 0, array.length);
return image;
}
Then you can use javax.imageio (ImageIO) to save your image as a TIFF or a PNG. I think the TwelveMonkeys project provides better TIFF support for ImageIO, but you have to check first.
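For reference, a minimal sketch of the write step, assuming img is the TYPE_USHORT_GRAY image built above and that a TIFF writer (for example the TwelveMonkeys imageio-tiff plugin, or the built-in writer on Java 9+) is on the classpath:
// ImageIO.write returns false if no writer for the requested format is registered
boolean written = ImageIO.write(img, "TIFF", new File("output.tif"));
if (!written) {
    System.err.println("No TIFF writer found - is a TIFF plugin on the classpath?");
}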
[EDIT] In your case, because you deal with huge DICOM images that cannot be stored in a regular BufferedImage, you have to create your own type, using the Unsafe class to allocate the DataBuffer:
Create a new class DataBufferLongShort that allocates the needed array/DataBuffer using the Unsafe class; then you can use long indexes instead of int indexes.
Create a new class DataBuffer that extends the classical DataBuffer in order to add a type TYPE_LONG_USHORT.
Then you can create the ColorModel with the new DataBuffer.
I have a .bin file that was created by MATLAB code as uint16, and I need to read it in Java.
With the code below, I get a blurry image with a very bad grayscale, and the length of the file seems to be double the number of pixels. There seems to be some loss of information when reading the file this way. Is there a way to read .bin files other than input streams?
This is how I try to read the .bin file:
is = new FileInputStream(filename);
dis = new DataInputStream(is);
int[] buf = new int[length];
int[][] real = new int[x][y];
while (dis.available() > 0) {
buf[i] = dis.readShort();
}
int counter = 0;
for (int j = 0; j < x; j++) {
for (int k = 0; k < y; k++) {
real[j][k] = buf[counter];
counter++;
}
}
return real;
And this is from the part from the main class where the first class is called:
BinaryFile2 binary = new BinaryFile2();
int[][] image = binary.read("data001.bin", 1024, 2048);
BufferedImage theImage = new BufferedImage(1024, 2048,
BufferedImage.TYPE_BYTE_GRAY);
for (int y = 0; y < 2048; y++) {
for (int x = 0; x < 1024; x++) {
int value = image[x][y];
theImage.setRGB(x, y, value);
}
}
File outputfile = new File("saved.png");
ImageIO.write(theImage, "png", outputfile);
You are storing uint16 data in an int array; this may lead to loss/corruption of data.
The following post discusses a similar issue:
Java read unsigned int, store, and write it back
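The usual fix for the sign problem, shown here as a small sketch against the question's loop, is to mask the short read from the stream back to its unsigned value before storing it in the int array; depending on how MATLAB wrote the file, you may also need to swap the byte order, because DataInputStream always reads big-endian:
// readShort() returns a signed Java short; masking with 0xFFFF recovers
// the original unsigned 16-bit value (0..65535) before it widens to int
buf[i] = dis.readShort() & 0xFFFF;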
To correctly read and display an image originally stored as uint16, it's best to use the BufferedImage.TYPE_USHORT_GRAY type. A Java short is 16 bits, and DataBufferUShort is made for storing unsigned 16-bit values.
Try this:
InputStream is = ...;
DataInputStream data = new DataInputStream(is);
BufferedImage theImage = new BufferedImage(1024, 2048, BufferedImage.TYPE_USHORT_GRAY);
short[] pixels = ((DataBufferUShort) theImage.getRaster().getDataBuffer()).getData();
for (int i = 0; i < pixels.length; i++) {
pixels[i] = data.readShort(); // short value is signed, but DataBufferUShort will take care of the "unsigning"
}
// TODO: close() streams in a finally block
To convert the image further to an 8 bit image, you can create a new image and draw the original onto that:
BufferedImage otherImage = new BufferedImage(1024, 2048, BufferedImage.TYPE_BYTE_GRAY);
Graphics2D g = otherImage.createGraphics();
try {
g.drawImage(theImage, 0, 0, null);
}
finally {
g.dispose();
}
Now you can store otherImage as an eight bit grayscale PNG.
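To actually get the files on disk, both images can be handed straight to ImageIO; the file names below are just examples:
// theImage (TYPE_USHORT_GRAY) should come out as a 16-bit grayscale PNG,
// otherImage (TYPE_BYTE_GRAY) as an 8-bit grayscale PNG
ImageIO.write(theImage, "png", new File("saved16.png"));
ImageIO.write(otherImage, "png", new File("saved8.png"));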
I am reading a .jpg image and accessing the pixels as:
if (type == BufferedImage.TYPE_3BYTE_BGR) {
System.out.println("type.3byte.bgr");
byte[] pixels = (byte[]) sourceImage.getData().getDataElements(0, 0, w, h, null);
}
// process this array called pixels and display the resulting image
// first i convert it to integer
int offset = 0;
int[] data=new int[width*height*3];
for ( i = 0; i < data.length; i++) {
data[i] = pixels[offset++] & 0xff;
}
// Then process this array. For now I am not processing it. When I create a buffered
// image from the data array and display it, the displayed image is not the same.
// The code I use is:
writeEdges(data);
private BufferedImage edgesImage;
private void writeEdges(int arb[]) {
if (edgesImage == null) {
edgesImage = new BufferedImage(w, h, BufferedImage.TYPE_INT_RGB);
}
edgesImage.getWritableTile(0, 0).setDataElements(0, 0, width,height, arb);
}
// for the penguins.jpg image (provided in the sample pictures in Windows 7)!
The output I get is not the same as the original image.
I do not know how your code runs, because it would not compile, as jcomeau_ictx pointed out. The problem is that you use different image types for reading and writing.
Here is a code snippet that would generate the same image as the input.
public static void main(String[] args) {
BufferedImage sourceImage = null;
try {
sourceImage = ImageIO.read(new File("IScSH.jpg"));
} catch (IOException e) {
}
int type = sourceImage.getType();
int w = sourceImage.getWidth();
int h = sourceImage.getHeight();
byte[] pixels = null;
if (type == BufferedImage.TYPE_3BYTE_BGR) {
System.out.println("type.3byte.bgr");
pixels = (byte[]) sourceImage.getData().getDataElements(0, 0, w, h, null);
}
try {
BufferedImage edgesImage = new BufferedImage(w, h, BufferedImage.TYPE_3BYTE_BGR);
edgesImage.getWritableTile(0, 0).setDataElements(0, 0, w, h, pixels);
ImageIO.write(edgesImage, "jpg", new File("nvRhP.jpg"));
} catch (IOException e) {
}
}