How to set DPI information in a grayscale image? - java

First of all, hello and thanks for the opportunity.
I have a solution that scans (via a native TWAIN source) an image (or many images) and saves them into a folder in the file system.
My question: in my tests I always got 96 DPI in the output images (I was using ImageIO.write to save the images that came from the TWAIN API as a BufferedImage object).
Then I saw Peter Kofler's answer on Stack Overflow at this link (How to set DPI information in an image?) and it works for the Color and Black & White cases (set up from the TwainCapability object).
But for my scanned grayscale images, the DPI doesn't change at all!
I'm verifying it using MS Paint -> Properties. It's always 96 DPI when I scan a grayscale image.
Any idea how I can set the DPI in this case?
I'm setting the DPI like in Peter Kofler's example:
resolutionState = 100;
//or resolutionState = 200;
//or resolutionState = 300;
double dotsPerMilli = resolutionState / 10 / 2.54; // DPI -> dots per millimeter (DPI / 25.4)
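For reference, the referenced answer merges the pixel size into the standard javax_imageio_1.0 metadata tree before calling writer.write(). A sketch reconstructed from that answer, as far as I can tell (note it deliberately stores dots per millimeter in the pixel-size fields, which is what the JDK's PNG writer expects):

private void setDPI(IIOMetadata metadata, double dotsPerMilli) throws IIOInvalidTreeException {
    IIOMetadataNode horiz = new IIOMetadataNode("HorizontalPixelSize");
    horiz.setAttribute("value", Double.toString(dotsPerMilli));
    IIOMetadataNode vert = new IIOMetadataNode("VerticalPixelSize");
    vert.setAttribute("value", Double.toString(dotsPerMilli));
    IIOMetadataNode dim = new IIOMetadataNode("Dimension");
    dim.appendChild(horiz);
    dim.appendChild(vert);
    IIOMetadataNode root = new IIOMetadataNode("javax_imageio_1.0");
    root.appendChild(dim);
    metadata.mergeTree("javax_imageio_1.0", root);
}

where metadata comes from writer.getDefaultImageMetadata(ImageTypeSpecifier.createFromRenderedImage(image), writeParam) for a PNG ImageWriter.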

Thanks one more time in advance.
I found an answer and one way to do this!
In the following link: How to change the DPI from 96 to 300 of an image in java after resizing? the user "user3603284" posted a solution that helped me do this.
I changed from PNG to JPEG (it does not matter for the project specifications) and then it worked like a charm!
The code:
// Note: these JPEG encoder classes live in the internal com.sun.image.codec.jpeg
// package, which was never part of the public API and was removed in Java 7.
import com.sun.image.codec.jpeg.JPEGCodec;
import com.sun.image.codec.jpeg.JPEGEncodeParam;
import com.sun.image.codec.jpeg.JPEGImageEncoder;

File imageFile = new File("C:/ScannerOutput/scannerImage" + System.currentTimeMillis() + ".jpeg");
FileOutputStream fos = new FileOutputStream(imageFile);
JPEGImageEncoder jpegEncoder = JPEGCodec.createJPEGEncoder(fos);
JPEGEncodeParam jpegEncodeParam = jpegEncoder.getDefaultJPEGEncodeParam(image);
jpegEncodeParam.setDensityUnit(JPEGEncodeParam.DENSITY_UNIT_DOTS_INCH);
jpegEncodeParam.setQuality(0.75f, false);
jpegEncodeParam.setXDensity(resolutionState); // DPI rate: 100, 200 or 300
jpegEncodeParam.setYDensity(resolutionState); // DPI rate: 100, 200 or 300
jpegEncoder.setJPEGEncodeParam(jpegEncodeParam);
jpegEncoder.encode(image, jpegEncodeParam);
image.flush();
fos.close();
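For anyone on Java 7 or later, where the com.sun.image.codec.jpeg classes no longer exist, here is a sketch of the same idea using only the public ImageIO API. It writes the density into the JFIF APP0 segment through the JPEG writer's native metadata tree; the format name "javax_imageio_jpeg_image_1.0" and the app0JFIF attributes are those documented for the standard JPEG plugin, but treat this as an untested starting point rather than the original poster's code:

import java.awt.image.BufferedImage;
import java.io.File;
import java.io.IOException;
import javax.imageio.IIOImage;
import javax.imageio.ImageIO;
import javax.imageio.ImageTypeSpecifier;
import javax.imageio.ImageWriteParam;
import javax.imageio.ImageWriter;
import javax.imageio.metadata.IIOMetadata;
import javax.imageio.stream.ImageOutputStream;
import org.w3c.dom.Element;

public class JpegDpiWriter {
    public static void writeJpegWithDpi(BufferedImage image, File file, int dpi) throws IOException {
        ImageWriter writer = ImageIO.getImageWritersByFormatName("jpeg").next();
        ImageWriteParam param = writer.getDefaultWriteParam();
        param.setCompressionMode(ImageWriteParam.MODE_EXPLICIT);
        param.setCompressionQuality(0.75f);

        // Edit the JFIF node of the writer's default metadata for this image type.
        IIOMetadata metadata = writer.getDefaultImageMetadata(
                ImageTypeSpecifier.createFromRenderedImage(image), param);
        String format = "javax_imageio_jpeg_image_1.0";
        Element tree = (Element) metadata.getAsTree(format);
        Element jfif = (Element) tree.getElementsByTagName("app0JFIF").item(0);
        jfif.setAttribute("resUnits", "1"); // 1 = dots per inch
        jfif.setAttribute("Xdensity", Integer.toString(dpi));
        jfif.setAttribute("Ydensity", Integer.toString(dpi));
        metadata.setFromTree(format, tree);

        try (ImageOutputStream out = ImageIO.createImageOutputStream(file)) {
            writer.setOutput(out);
            writer.write(null, new IIOImage(image, null, metadata), param);
        } finally {
            writer.dispose();
        }
    }
}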
Thanks very much SO, always helping me =)

Related

In Java converting an image to sRGB makes the image too bright

I have multiple images with a custom profile embedded in them and want to convert the image to sRGB in order to serve it up to a browser. I have seen code like the following:
BufferedImage image = ImageIO.read(fileIn);
ColorSpace ics = ColorSpace.getInstance(ColorSpace.CS_sRGB);
ColorConvertOp cco = new ColorConvertOp(ics, null);
BufferedImage result = cco.filter(image, null);
ImageIO.write(result, "PNG", fileOut);
where fileIn and fileOut are File objects representing the input and output files respectively. This works to an extent. The problem is that the resulting image is lighter than the original. If I were to convert the color space in Photoshop, the colors would appear the same. In fact, if I pull up both images in Photoshop, take a screenshot, and sample the colors, they are the same. What is Photoshop doing that the code above isn't, and what can I do to correct the problem?
There are various types of images being converted, including JPEG, PNG, and TIFF. I have tried using TwelveMonkeys to read in the JPEG and TIFF images and I still get the same effect, where the image is too light. The conversion seems worse when applied to an image that didn't have an embedded profile in the first place.
Edit:
I've added some sample images to help explain the problem.
This image is the one with the color profile embedded in it. In some browsers there won't be a noticeable difference between this one and the next, but viewed in Chrome on Mac OS X and Windows it currently appears darker than it should. This is where my problem originates in the first place: I need to convert the image to something that will show up correctly in Chrome.
This is an image converted with ImageMagick to the Adobe RGB 1998 color profile, which Chrome appears to be able to display correctly.
This is the image that I converted using the code above and it appears lighter than it should.
(Note that the images above are on imgur so to make them larger, simply remove the "t" from the end of the filename, before the file extension.)
This was my initial solution, which worked, but I didn't like having to use ImageMagick. I have created another answer based on the solution I ended up sticking with.
I gave in and ended up using im4java, which is a wrapper around the ImageMagick command line tool. When I use the following code to get a BufferedImage, it works really well.
IMOperation op = new IMOperation();
op.addImage(fileIn.getAbsolutePath());
op.profile(colorFileIn.getAbsolutePath());
op.addImage("png:-");
ConvertCmd cmd = new ConvertCmd();
Stream2BufferedImage s2b = new Stream2BufferedImage();
cmd.setOutputConsumer(s2b);
cmd.run(op);
BufferedImage image = s2b.getImage();
I can also use the library to apply a CMYK profile for print when needed. It would be nice if ColorConvertOp did the conversion correctly, but for now, at least, this is my solution. In order to reach parity with my question, the im4java code to achieve the same effect as in the question is:
ConvertCmd cmd = new ConvertCmd();
IMOperation op = new IMOperation();
op.addImage(fileIn.getAbsolutePath());
op.profile(colorFileIn.getAbsolutePath());
op.addImage(fileOut.getAbsolutePath());
cmd.run(op);
where colorFileIn.getAbsolutePath() is the location of the sRGB color profile on the machine. Since im4java uses the command line, it's not as straightforward to perform operations, but the library is explained in detail here. I originally had issues with ImageMagick not working on my Mac, as explained in the question. I installed it using brew, but it turns out that on a Mac you have to install it like brew install imagemagick --with-little-cms. After that, ImageMagick worked fine for me.
I found a solution that doesn't require ImageMagick. Basically, Java doesn't respect the embedded profile when loading the image, so if there is one it needs to be loaded explicitly. Here is a code snippet of what I did to accomplish this:
private BufferedImage loadBufferedImage(InputStream inputStream) throws IOException, BadElementException {
    byte[] imageBytes = IOUtils.toByteArray(inputStream);
    BufferedImage incorrectImage = ImageIO.read(new ByteArrayInputStream(imageBytes));
    if (incorrectImage.getColorModel() instanceof ComponentColorModel) {
        // Java does not respect the color profile embedded in a component based image, so if there is a
        // color profile, detected using iText, then create a buffered image with the correct profile.
        Image iTextImage = Image.getInstance(imageBytes);
        com.itextpdf.text.pdf.ICC_Profile iTextProfile = iTextImage.getICCProfile();
        if (iTextProfile == null) {
            // If no profile is present then the image should be processed as is.
            return incorrectImage;
        } else {
            // If there is a profile present then create a buffered image with the profile embedded.
            byte[] profileData = iTextProfile.getData();
            ICC_Profile profile = ICC_Profile.getInstance(profileData);
            ICC_ColorSpace ics = new ICC_ColorSpace(profile);
            boolean hasAlpha = incorrectImage.getColorModel().hasAlpha();
            boolean isAlphaPremultiplied = incorrectImage.isAlphaPremultiplied();
            int transparency = incorrectImage.getTransparency();
            int transferType = DataBuffer.TYPE_BYTE;
            ComponentColorModel ccm = new ComponentColorModel(ics, hasAlpha, isAlphaPremultiplied, transparency, transferType);
            return new BufferedImage(ccm, incorrectImage.copyData(null), isAlphaPremultiplied, null);
        }
    } else if (incorrectImage.getColorModel() instanceof IndexColorModel) {
        return incorrectImage;
    } else {
        throw new UnsupportedEncodingException("Unsupported color model type.");
    }
}
This answer does use iText, which is generally used for PDF creation and manipulation, but it happens to process the ICC profiles correctly, and I'm already depending on it in my project, so it happens to be a much better choice than ImageMagick.
The code in the question then ends up as follows:
BufferedImage image = loadBufferedImage(new FileInputStream(fileIn));
ColorSpace ics = ColorSpace.getInstance(ColorSpace.CS_sRGB);
ColorConvertOp cco = new ColorConvertOp(ics, null);
BufferedImage result = cco.filter(image, null);
ImageIO.write(result, "PNG", fileOut);
which works great.

Java: Slow image scale

I'm using this code in Java:
Image img = ImageIO.read(new File("imagepath/file.png")).getScaledInstance(300, 300, BufferedImage.SCALE_SMOOTH);
BufferedImage buffered = new BufferedImage(300, 300, BufferedImage.SCALE_FAST);
buffered.getGraphics().drawImage(img, 0, 0 , null);
ByteArrayOutputStream os = new ByteArrayOutputStream();
ImageIO.write(buffered, "png", os);
InputStream in = new ByteArrayInputStream(os.toByteArray());
return in;
This successfully scales down and shows a thumbnail in the browser when run on my laptop. However, when I launch it on my mini server (a Raspberry Pi) it is horribly slow; more precisely, it takes about 4 times longer than loading the actual full-res image.
Can anybody tell me how this is even possible? 300x300 < 1280x720! It should be less work and less bandwidth!
Cheers!
getScaledInstance is known to be slow; see for example this article for a detailed explanation.
Note that your
BufferedImage buffered = new BufferedImage(300, 300, BufferedImage.SCALE_FAST);
line is wrong: the third argument should be an image type (TYPE_INT_RGB, TYPE_INT_ARGB, TYPE_INT_ARGB_PRE, etc.), not SCALE_FAST (which is a scaling hint inherited from Image, not an image type).
Also see this: How to scale a BufferedImage
For quality downscaling see this: Quality of Image after resize very low -- Java
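As a rough sketch of the drawImage-based approach those links describe (assuming a plain RGB source image; the image type and interpolation hint are choices to adapt, not fixed requirements):

import java.awt.Graphics2D;
import java.awt.RenderingHints;
import java.awt.image.BufferedImage;
import java.io.File;
import java.io.IOException;
import javax.imageio.ImageIO;

public class FastScale {
    public static BufferedImage scaleTo300(File file) throws IOException {
        BufferedImage src = ImageIO.read(file);
        // TYPE_INT_RGB is an image type; the SCALE_* constants are not valid here.
        BufferedImage dst = new BufferedImage(300, 300, BufferedImage.TYPE_INT_RGB);
        Graphics2D g = dst.createGraphics();
        g.setRenderingHint(RenderingHints.KEY_INTERPOLATION,
                RenderingHints.VALUE_INTERPOLATION_BILINEAR);
        g.drawImage(src, 0, 0, 300, 300, null); // scale while drawing
        g.dispose();
        return dst;
    }
}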

Update Tiff-Metadata using Apache Commons-Imaging/Sanselan

I want to modify and add TIFF tags to existing .tif files in Java. JAI ImageIO crashed because it could not deal with certain tags from TIFF 6.0. Apache Commons Imaging seems to be able to deal with these tags, but I have no idea how. I found a post here which I used as a starting point (How to embed ICC_Profile in TiffOutputSet).
Using the example code creates an image which I can't open because of an LZW error. If I use the Imaging.writeImage(...) methods, it changes the color model from 8 bit to 24 bit and the Exif metadata is gone.
What I have done is:
bufferedImage = Imaging.getBufferedImage(srcTiff);
byte[] imageBytes = Imaging.writeImageToBytes(tifFile, imageFormat, optional_params)
exifDirectory = tiffOutputSet.getOrCreateRootDirectory();
...
TiffImageWriterLossless lossLessWriter = new TiffImageWriterLossless(imageBytes);
os = new FileOutputStream(tmpFile);
os = new BufferedOutputStream(os);
lossLessWriter.writeImage(bufferedImage, os, image_params);
Playing around with image_params, like compression, or passing the output set as params, results in different issues. But one thing is constant: the destination image is bigger than the source image, even when the source image is 24 bit like the destination image.
How can I get Commons Imaging to work for me?
I can respond to the destination image being bigger than the source: TIFF images have a compression that is not carried over when the image is read into memory. On writing the image back to storage, you must apply the compression explicitly.
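A minimal sketch of what applying the compression explicitly could look like. This assumes the commons-imaging 1.0-alpha API; the parameter key and constant names have moved between Sanselan and later commons-imaging releases, so check them against the version you actually use:

import java.awt.image.BufferedImage;
import java.io.File;
import java.io.FileOutputStream;
import java.io.OutputStream;
import java.util.HashMap;
import java.util.Map;
import org.apache.commons.imaging.ImageFormats;
import org.apache.commons.imaging.Imaging;
import org.apache.commons.imaging.ImagingConstants;
import org.apache.commons.imaging.formats.tiff.constants.TiffConstants;

public class TiffRewrite {
    public static void writeLzwTiff(BufferedImage image, File dest) throws Exception {
        Map<String, Object> params = new HashMap<>();
        // Request LZW so the output is compressed like a typical source TIFF.
        params.put(ImagingConstants.PARAM_KEY_COMPRESSION,
                TiffConstants.TIFF_COMPRESSION_LZW);
        try (OutputStream os = new FileOutputStream(dest)) {
            Imaging.writeImage(image, os, ImageFormats.TIFF, params);
        }
    }
}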

ZXing library can't decode Datamatrix barcode

I'm trying to use the ZXing library to decode a Data Matrix barcode. Here is my code sample:
BufferedImage bi = img.getBufferedImage();
Hashtable<DecodeHintType, Object> hints = new Hashtable<DecodeHintType, Object>();
hints.put(DecodeHintType.TRY_HARDER, Boolean.TRUE);
LuminanceSource source = new BufferedImageLuminanceSource(bi);
BinaryBitmap bitmap = new BinaryBitmap(new HybridBinarizer(source));
DataMatrixReader dataMatrixReader = new DataMatrixReader();
try {
    Result res = dataMatrixReader.decode(bitmap, hints);
    System.out.println("resultText = " + res.getText());
} catch (Exception e) {
    System.out.println("failed to get resultText");
    e.printStackTrace();
}
I've seen almost the same samples many times across Stack Overflow and other sites, but this approach is not working for me in this form.
As a source I'm using images grabbed from an IR camera. Here is an example image:
As you can see, the barcode is almost exactly at the center of the image, as Sean Owen recommended here and here. If I programmatically convert this image to black & white and crop it to bound the barcode with only some white space around it, then ZXing works perfectly with images like this. But the problem is that real barcodes can have small deformations, so my simple algorithm can't crop the image properly. Moreover, the barcode may sit not exactly at the center of the image and could have slightly different brightness. I saw threads mentioning OpenCV's capabilities for finding the placement of specific objects in an image, like this one, but they are quite old. Has anything changed since then? And what else should I consider in order to write a 100% reliable Data Matrix decoder (and detector) for my specific situation?
I decided to supply the LuminanceSource and BinaryBitmap images, made from the .toString() text output of the corresponding objects, for reference:
http://s28.postimg.org/l53sykhx9/Binary_Bitmap.png
and /65z0vlbpl/Luminance_Source.png (at the same domain). They look good and ready for decoding, so what is going wrong with the decoding?
After all, this image and similar ones are recognized and decoded very well by smartphone software, and I just want to achieve the same results.
You need to enable it from the settings, programmatically or manually.
In the class DecodeThread.java you can see the line that enables Data Matrix decoding:
decodeFormats.addAll(DecodeFormatManager.DATA_MATRIX_FORMATS);
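If you are calling the core library directly rather than going through the Android app's DecodeThread, the equivalent is to pass the POSSIBLE_FORMATS hint; a minimal sketch using ZXing core's MultiFormatReader:

import java.util.EnumMap;
import java.util.EnumSet;
import java.util.Map;
import com.google.zxing.BarcodeFormat;
import com.google.zxing.BinaryBitmap;
import com.google.zxing.DecodeHintType;
import com.google.zxing.MultiFormatReader;
import com.google.zxing.NotFoundException;
import com.google.zxing.Result;

public class DataMatrixDecoder {
    public static Result decode(BinaryBitmap bitmap) throws NotFoundException {
        Map<DecodeHintType, Object> hints = new EnumMap<>(DecodeHintType.class);
        hints.put(DecodeHintType.TRY_HARDER, Boolean.TRUE);
        // Restrict the search to Data Matrix instead of all supported formats.
        hints.put(DecodeHintType.POSSIBLE_FORMATS, EnumSet.of(BarcodeFormat.DATA_MATRIX));
        return new MultiFormatReader().decode(bitmap, hints);
    }
}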

How to get a good quality thumbnail

I am trying to create a high-quality thumbnail of this image with Java and Scalr 3.2.
This is the relevant source code, where THUMB_WIDTH = 77 and THUMB_HEIGHT = 57:
BufferedImage srcImg = ImageIO.read(new File(sourceFile));
BufferedImage dstImg = Scalr.resize(srcImg, Scalr.Method.QUALITY,
THUMB_WIDTH, THUMB_HEIGHT);
ImageIO.write(dstImg, format, new File(destFile));
If I use format = "png", here is the result:
If I use format = "jpg", here is the result:
With ImageMagick's identify I found out that the JPEG is saved with a quality of 75, which is totally insufficient to create a good-looking thumbnail. The PNG doesn't look good to me either.
Here is the output of identify of the original file and the two thumbnails:
$ identify 42486_1.jpg 42486_s1.jpg 42486_s1.png
42486_1.jpg JPEG 580x435 580x435+0+0 8-bit DirectClass 50.6KB 0.000u 0:00.000
42486_s1.jpg[1] JPEG 77x58 77x58+0+0 8-bit DirectClass 2.22KB 0.000u 0:00.000
42486_s1.png[2] PNG 77x58 77x58+0+0 8-bit DirectClass 12.2KB 0.000u 0:00.000
Questions
How can I improve the quality of the generated thumbnail?
How can I save a JPEG with a higher quality? I'd like to try a higher quality and compare the results. I couldn't find anything in the JavaDoc for ImageIO.write.
Why, when I tell Scalr that my maximum dimensions are 77x57, does it output a 77x58 image? I think that is to maintain the proportions, but those are my maximum width and maximum height: width or height could be less, but not more.
UPDATE: With a web search I found an article about how to adjust JPEG image compression quality. I wrote my own method to save a BufferedImage while setting the quality:
/**
 * Write a JPEG file setting the compression quality.
 *
 * @param image
 *            a BufferedImage to be saved
 * @param destFile
 *            destination file (absolute or relative path)
 * @param quality
 *            a float between 0 and 1, where 1 means uncompressed.
 * @throws IOException
 *             in case of problems writing the file
 */
private void writeJpeg(BufferedImage image, String destFile, float quality)
        throws IOException {
    ImageWriter writer = null;
    FileImageOutputStream output = null;
    try {
        writer = ImageIO.getImageWritersByFormatName("jpeg").next();
        ImageWriteParam param = writer.getDefaultWriteParam();
        param.setCompressionMode(ImageWriteParam.MODE_EXPLICIT);
        param.setCompressionQuality(quality);
        output = new FileImageOutputStream(new File(destFile));
        writer.setOutput(output);
        IIOImage iioImage = new IIOImage(image, null, null);
        writer.write(null, iioImage, param);
    } catch (IOException ex) {
        throw ex;
    } finally {
        if (writer != null) writer.dispose();
        if (output != null) output.close();
    }
}
Here are the results. PNG:
JPEG quality 75:
JPEG quality 90 (the gravatars on stackoverflow are saved as JPEG quality 90):
and the filesize:
thumb90.jpg JPEG 77x58 77x58+0+0 8-bit DirectClass 6.89KB 0.000u 0:00.000
UPDATE 2: a test to compare Scalr with java-image-scaling.
private void scaleAndSaveImageWithScalr(String sourceFile, String destFile, int width, int height)
        throws IOException {
    BufferedImage sourceImage = ImageIO.read(new File(sourceFile));
    BufferedImage destImage = Scalr.resize(sourceImage, Scalr.Method.QUALITY, width, height);
    writeJpeg(destImage, destFile, JPEG_QUALITY);
}

private void scaleAndSaveImageWithJImage(String sourceFile, String destFile, int width, int height)
        throws IOException {
    BufferedImage sourceImage = ImageIO.read(new File(sourceFile));
    ResampleOp resampleOp = new ResampleOp(width, height);
    resampleOp.setFilter(ResampleFilters.getLanczos3Filter());
    resampleOp.setUnsharpenMask(AdvancedResizeOp.UnsharpenMask.Normal);
    BufferedImage destImage = resampleOp.filter(sourceImage, null);
    writeJpeg(destImage, destFile, JPEG_QUALITY);
}
JPEG quality 90 generated with Scalr:
JPEG quality 90 generated with java-image-scaling:
I didn't receive any further feedback, so my personal conclusion is that java-image-scaling provides superior quality, and so it's the library that I chose.
@Stivlo, I am sorry for not replying to this; I never got any notification from SO about the question.
java-image-scaling does have some nice filters to help with fine-tuning if you need it. That said, in v4.2 of imgscalr I added the new ULTRA_QUALITY that might get you closer to what you want.
I hope that helps, but realize this is being replied to almost a year after the fact unfortunately. Sorry about that.
This is not a complete answer to your question, but:
Regarding JPEG quality:
Compression quality can be set using an ImageWriteParam, as described here. They suggest using an int value of 0|1, but I believe you should actually specify a float value between 0.0 and 1.0.
Regarding your scaling dimension issues:
From the Scalr homepage:
NOTE: If a width and height are provided that violate the image's proportions (e.g. attempt to resize an 800×600 image to a 150×150 square) the library will first look at the orientation of the image (landscape/square or portrait) and then select the primary dimension (landscape or square uses width, portrait uses height) to recalculate a correct secondary dimension, ignoring what was passed in by the user that was violating the proportions.
In your case the primary dimension will be the width of 77, and thus your height limit of 57 will be ignored.
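If you need both limits respected and can accept a one-pixel distortion of the proportions, later imgscalr versions accept a Mode argument; a hedged sketch, assuming imgscalr 4.2+ (Scalr.Mode.FIT_EXACT was added in 4.2, after this question was asked):

import java.awt.image.BufferedImage;
import org.imgscalr.Scalr;

public class ExactThumbnail {
    public static BufferedImage thumb(BufferedImage src, int width, int height) {
        // FIT_EXACT uses the given dimensions as-is instead of recalculating
        // the secondary dimension from the image orientation.
        return Scalr.resize(src, Scalr.Method.QUALITY, Scalr.Mode.FIT_EXACT, width, height);
    }
}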
I have run the same tests, and java-image-scaling definitely gives better results for thumbnails smaller than 250px. It also supports sharpening filters, which make the results better.
I keep both libraries since the syntax of Scalr is often easier, with only one line.
Note that if your images have an alpha channel, both libraries are problematic. (I'm only talking about shrinking images; I haven't tested enlarging them.)
java-image-scaling may create an ugly border around the transparent edges, depending on the image, and this looks very bad. I found no way to avoid it.
Scalr is only problematic in the (ultra) quality modes. It can easily be used in a way that works fine, though: bicubic interpolation leaves artifacts in transparent images, so you may want to avoid it. Since it's the default for the (ultra) quality modes and scaleImageIncrementally() is protected, you'd have to subclass Scalr to change this if you want that quality (a fraction higher than 2 looks very blurry with bilinear filtering, though).
If you want a high-quality result, use the RapidDecoder library. It is as simple as the following:
import rapid.decoder.BitmapDecoder;
...
Bitmap bitmap = BitmapDecoder.from(getResources(), R.drawable.image)
.scale(width, height)
.useBuiltInDecoder(true)
.decode();
Don't forget to use the built-in decoder if you want to scale down to less than 50% and get an HQ result.
