I recently developed a mass file uploader (a Swing application). One of the new requirements is to support uploading thousands of documents (GIF, JPG, PNG, DOCX, XLSX), each around 3 MB to 10 MB in size, and we don't want to upload files that large. We generally work with TIFF files that have a small byte size, around 60 KB to 100 KB. We are not concerned about image quality; we only need to upload these documents for future reference. Right now I have no idea how to solve this problem, so I started researching it. Please point me in the right direction.
-PD
My first approach would be to convert them to PDF files. Everything that can be printed can be converted to PDF, and PDF also allows for image compression. TIFF won't be a good idea for DOC/XLS, I think; it might make them bigger.
A .doc or .xlsx can be gzipped very quickly for decent savings (a sketch follows below).
Images are riskier; it depends on what the data is. Pictures of people? Pictures of text?
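For the Office documents, here is a minimal gzip sketch using only java.util.zip; the class name and file path are placeholders:

```java
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.zip.GZIPOutputStream;

public class GzipDocs {

    /** Writes a gzipped copy of the input file next to it, with a .gz suffix. */
    public static Path gzip(Path input) throws IOException {
        Path output = Paths.get(input.toString() + ".gz");
        try (InputStream in = Files.newInputStream(input);
             OutputStream out = new GZIPOutputStream(Files.newOutputStream(output))) {
            in.transferTo(out);   // Java 9+: stream the whole file through the compressor
        }
        return output;
    }

    public static void main(String[] args) throws IOException {
        Path compressed = gzip(Paths.get("report.doc"));   // placeholder path
        System.out.println("Wrote " + compressed + " (" + Files.size(compressed) + " bytes)");
    }
}
```

Note that .docx and .xlsx are already zip archives internally, so gzip will save much more on the old binary .doc/.xls formats than on the newer ones.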
Related
As a challenge, and for practicality, I have a few images I want to include inside my jar for a mod I'm developing. But as it is, the compiled jar with the images ends up weighing over 25 MB, which is not ideal. So my idea is to simply compress them and add that compressed package as a resource, but doing so doesn't appear to shrink the file size much. So the two questions I want to ask (if that plan makes sense) are: what is the best way to compress a bunch of images, and how can I then use them while running the game/program?
Lossy image compression is one way to reduce file size, but when dealing with images that cover the entire screen, the drop in quality is easily noticed, which is not what I want.
Another solution is to simply not make them part of the mod and instead have them downloaded separately and read from there. But that feels a little cheap, and it would still mean downloading the same number of MBs to get the full experience.
Note: I don't need to compress them in my program, just read them while they are compressed.
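On the reading side, here is a minimal sketch, assuming the images are bundled as a single zip file on the classpath; the resource path /assets/images.zip and the class name are placeholders:

```java
import java.awt.image.BufferedImage;
import java.io.IOException;
import java.io.InputStream;
import java.util.HashMap;
import java.util.Map;
import java.util.zip.ZipEntry;
import java.util.zip.ZipInputStream;
import javax.imageio.ImageIO;

public class PackedImages {

    /** Loads every image from a zip packaged on the classpath, keyed by entry name. */
    public static Map<String, BufferedImage> loadAll(String resourcePath) throws IOException {
        Map<String, BufferedImage> images = new HashMap<>();
        try (InputStream raw = PackedImages.class.getResourceAsStream(resourcePath);
             ZipInputStream zip = new ZipInputStream(raw)) {
            ZipEntry entry;
            while ((entry = zip.getNextEntry()) != null) {
                if (entry.isDirectory()) continue;
                BufferedImage img = ImageIO.read(zip);   // reads are bounded to the current entry
                if (img != null) images.put(entry.getName(), img);
            }
        }
        return images;
    }
}
```

Keep in mind that PNG and JPG data is already compressed, which is why zipping the images barely shrinks the jar; re-encoding the images themselves (smaller dimensions or lossier settings) is what actually reduces the size.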
I was trying to open JPEG files in a Java program and noticed that neither ImageIO nor the Apache Commons Imaging library could open the images. The Commons library showed me this error:
"Only sequential, baseline JPEGs are supported at the moment"
So my image files are compressed in a way that neither library can read. I could write an ImageJ macro and transform all the images first, but I would like to use just my program and not something extra.
Is there a way to find the compression mode of a JPEG, or even a Java library that can read JPEGs in several modes?
Thanks in advance
Suddenly it works with the standard ImageIO class. Don't ask me why. Thanks for your help, guys.
You need to get some tool that will allow you to dump a JPEG stream. There are a number of them out there.
What you are looking for is the start of frame marker.
FFC0 indicates baseline sequential.
FFC1 indicates extended sequential. It doesn't take much more code to do extended sequential than baseline. It is puzzling why a decoder would limit itself to baseline these days.
FFC2 is progressive.
There are others, but those three are the ones you are likely to encounter, and they are widely supported.
You just need to find a tool that will save in baseline format. Finding one to read the other two formats is easy.
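If you don't have a dump tool handy, here is a minimal Java sketch that scans a file for the first start-of-frame marker; the class and method names are just for illustration and the error handling is deliberately thin:

```java
import java.io.DataInputStream;
import java.io.FileInputStream;
import java.io.IOException;

public class JpegSofSniffer {

    /** Returns the first SOFn marker byte (0xC0 baseline, 0xC1 extended, 0xC2 progressive, ...) or -1. */
    public static int findSofMarker(String path) throws IOException {
        try (DataInputStream in = new DataInputStream(new FileInputStream(path))) {
            if (in.readUnsignedByte() != 0xFF || in.readUnsignedByte() != 0xD8) {
                throw new IOException("Not a JPEG (missing SOI marker)");
            }
            while (true) {
                if (in.readUnsignedByte() != 0xFF) continue;            // resync to a marker prefix
                int marker = in.readUnsignedByte();
                while (marker == 0xFF) marker = in.readUnsignedByte();  // skip fill bytes
                if (marker >= 0xC0 && marker <= 0xCF
                        && marker != 0xC4 && marker != 0xC8 && marker != 0xCC) {
                    return marker;                                      // found a start-of-frame marker
                }
                if (marker == 0xD9 || marker == 0xDA) return -1;        // EOI or SOS: no SOF seen
                if (marker == 0x00 || marker == 0x01
                        || (marker >= 0xD0 && marker <= 0xD7)) continue; // standalone, no segment length
                int length = in.readUnsignedShort();                    // length includes these two bytes
                in.readFully(new byte[length - 2]);                     // skip the segment payload
            }
        }
    }

    public static void main(String[] args) throws IOException {
        switch (findSofMarker(args[0])) {
            case 0xC0: System.out.println("FFC0: baseline sequential"); break;
            case 0xC1: System.out.println("FFC1: extended sequential"); break;
            case 0xC2: System.out.println("FFC2: progressive"); break;
            default:   System.out.println("other SOF type, or none found");
        }
    }
}
```

Run it on one of the problem files; if it prints FFC2, the images are progressive, which matches the "only baseline" error message.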
I have a REST service that allows users to upload images and then serves those images. The images are stored in a database. I'd like to do JPG optimization for these images.
There are several command-line tools to do this, but I'd like to avoid first saving the images to disk and then running some command-line tool. I'd rather use a Java library that operates directly on a binary stream containing the image data.
What I'm after is treatment similar to what, for example, Trimage does:
Remove all EXIF metadata from the image
Losslessly (re)compress to the highest available compression level
Is this possible?
It is possible to remove extraneous APPn and COM markers from the data stream. You can do that without expanding and recompressing.
Each time you decompress and recompress with different quantization tables, you lose data in JPEG. There is no real point in recompressing.
Yes to question #1. No to question #2.
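For question #1, here is a minimal sketch that works directly on the binary stream, copying the JPEG segment by segment and dropping the APP1-APP15 and COM segments (which is where EXIF, XMP and comments live) without touching the compressed image data. The class name is just for illustration:

```java
import java.io.BufferedInputStream;
import java.io.BufferedOutputStream;
import java.io.DataInputStream;
import java.io.DataOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

public class JpegMarkerStripper {

    /** Copies a JPEG stream to dst, dropping APP1..APP15 (EXIF, XMP, ...) and COM segments. */
    public static void strip(InputStream src, OutputStream dst) throws IOException {
        DataInputStream in = new DataInputStream(new BufferedInputStream(src));
        DataOutputStream out = new DataOutputStream(new BufferedOutputStream(dst));

        if (in.readUnsignedShort() != 0xFFD8) throw new IOException("Not a JPEG");
        out.writeShort(0xFFD8);                                  // keep the SOI marker

        while (true) {
            if (in.readUnsignedByte() != 0xFF) throw new IOException("Expected a marker");
            int type = in.readUnsignedByte();
            while (type == 0xFF) type = in.readUnsignedByte();   // skip fill bytes

            if (type == 0xDA) {                                  // SOS: copy the rest of the stream verbatim
                out.writeByte(0xFF);
                out.writeByte(type);
                byte[] buf = new byte[8192];
                for (int n; (n = in.read(buf)) != -1; ) out.write(buf, 0, n);
                break;
            }

            int length = in.readUnsignedShort();                 // segment length includes these two bytes
            byte[] payload = new byte[length - 2];
            in.readFully(payload);

            boolean drop = (type >= 0xE1 && type <= 0xEF) || type == 0xFE;   // APPn (APP0/JFIF is kept) and COM
            if (!drop) {
                out.writeByte(0xFF);
                out.writeByte(type);
                out.writeShort(length);
                out.write(payload);
            }
        }
        out.flush();
    }
}
```

Because only metadata segments are removed, the pixel data stays byte-for-byte identical to the original and nothing is recompressed.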
I have an application where users can upload images.
These images are usually taken directly from cameras and are 1 MB or more in size.
May I know the standard way of compressing these images before saving them to the database as BLOBs?
The images stored in the database are just for viewing; there is no requirement to edit them.
I have read this:
Compress Image before Saving to disk in Java
But I am wondering whether there are more standard ways, so that the system can be more maintainable.
Some code and links would be greatly appreciated. +1
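The approach in the linked question comes down to re-encoding with ImageIO using an explicit compression quality and storing the resulting bytes. A minimal sketch, where the class name and the 0.7f quality value are just examples:

```java
import java.awt.Color;
import java.awt.Graphics2D;
import java.awt.image.BufferedImage;
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import javax.imageio.IIOImage;
import javax.imageio.ImageIO;
import javax.imageio.ImageWriteParam;
import javax.imageio.ImageWriter;
import javax.imageio.stream.ImageOutputStream;

public class BlobImageCompressor {

    /** Re-encodes an uploaded image as a JPEG at the given quality (0.0-1.0) before BLOB storage. */
    public static byte[] toCompressedJpeg(byte[] uploadedBytes, float quality) throws IOException {
        BufferedImage source = ImageIO.read(new ByteArrayInputStream(uploadedBytes));

        // JPEG has no alpha channel, so draw onto an opaque RGB canvas first
        BufferedImage rgb = new BufferedImage(source.getWidth(), source.getHeight(), BufferedImage.TYPE_INT_RGB);
        Graphics2D g = rgb.createGraphics();
        g.drawImage(source, 0, 0, Color.WHITE, null);
        g.dispose();

        ImageWriter writer = ImageIO.getImageWritersByFormatName("jpg").next();
        ImageWriteParam param = writer.getDefaultWriteParam();
        param.setCompressionMode(ImageWriteParam.MODE_EXPLICIT);
        param.setCompressionQuality(quality);                 // e.g. 0.7f: noticeably smaller, still fine for viewing

        ByteArrayOutputStream out = new ByteArrayOutputStream();
        try (ImageOutputStream ios = ImageIO.createImageOutputStream(out)) {
            writer.setOutput(ios);
            writer.write(null, new IIOImage(rgb, null, null), param);
        } finally {
            writer.dispose();
        }
        return out.toByteArray();                             // these bytes go into the BLOB column
    }
}
```

Since the images are only for viewing, scaling them down to a maximum display size before re-encoding usually saves far more than the quality setting alone, because camera images are often several thousand pixels wide.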
Which is faster: to receive images as JPG from the server, save them, and then show them in a ListView (or whatever),
OR
receive images in XML (as an encoded string) from the server, then decode them, and then show them in a ListView (or even save them and then show them from memory)?
What is the best way (regarding performance) to transfer images from a server to an Android device?
thanks in advance
That depends on where the limitation is. If the bandwidth of your connection is very small, try to transfer as few bytes as possible. However, high compression usually costs more CPU, so if CPU power is the limiting factor it may be better to use lower compression.
I am not sure what you mean by "receive images in an XML". Is it some vector format like SVG? That would normally be much smaller than a raster image (especially for large sizes).
To sum up, you will need to do some experiments to find out what works best in your case.
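If "encoded String" means Base64 inside the XML, note that Base64 adds roughly a third on top of the raw JPG bytes before any XML parsing cost. A quick way to measure the overhead for one of your own images (plain Java, the file path is a placeholder):

```java
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.Base64;

public class Base64Overhead {
    public static void main(String[] args) throws Exception {
        byte[] jpeg = Files.readAllBytes(Paths.get("photo.jpg"));   // placeholder path
        String encoded = Base64.getEncoder().encodeToString(jpeg);
        System.out.printf("raw JPG bytes : %d%n", jpeg.length);
        System.out.printf("Base64 chars  : %d (about 33%% more on the wire)%n", encoded.length());
    }
}
```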
The best way is to get the image URL from the server and then download the image using an image-loading library like https://github.com/nostra13/Android-Universal-Image-Loader or Google's Volley. These kinds of libraries are highly configurable and take care of all the aspects of managing a bitmap.