My app saves some images to internal storage while it is loading.
The problem is that the app uses the most common code for this situation, which compresses the image before saving it, but the compression takes too long. For example, with 5 pictures to load, the app takes about 30 seconds to open the home screen. 30 seconds is too long to open an app.
My code to save the image is the following:
public static final boolean savePngLocalStorage(String fileName, Bitmap bitmap, Context context) throws IOException {
    BufferedOutputStream bos = null;
    Bitmap tmp = null;
    try {
        bos = new BufferedOutputStream(context.openFileOutput(fileName, Context.MODE_PRIVATE)); // not accessible to other apps
        tmp = bitmap.copy(Config.ARGB_8888, true);
        return tmp.compress(Bitmap.CompressFormat.PNG, 100, bos);
    } finally {
        if (tmp != null) {
            tmp.recycle();
            tmp = null;
        }
        try {
            bos.close();
        } catch (Exception e) {
            // IOException, NullPointerException
        }
    }
}
Using the debugger, I found that the tmp.compress call is what takes most of the time.
I tried the following code, which skips the compression step. It got a bit faster.
HttpURLConnection connection = (HttpURLConnection) url.openConnection();
connection.setDoInput(true);
connection.connect();
ContextWrapper c = new ContextWrapper(MainActivity.this);
File path = c.getFilesDir();
String fileName = imageIdArray[i] + ".png";
// reuse the connection opened above instead of opening a second one with url.openStream()
InputStream input = new BufferedInputStream(connection.getInputStream());
OutputStream output = new FileOutputStream(path + "/" + fileName); // "data/data/[package_name]/files/sample.png"
byte[] data = new byte[1024];
long total = 0;
int count;
while ((count = input.read(data)) != -1) {
    total += count;
    output.write(data, 0, count);
}
output.flush();
output.close();
input.close();
Are there other ways to save the image faster?
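One answer takes a different route: compress to JPEG instead of PNG. JPEG encoding is generally much faster than PNG encoding, at the cost of lossy output: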
FileOutputStream out = null;
String path = setOutputFilePath();
try {
    out = new FileOutputStream(path);
    // JPEG is lossy but encodes much faster than PNG; 100 is the maximum quality.
    // croppedBitmap2 is your Bitmap instance.
    croppedBitmap2.compress(Bitmap.CompressFormat.JPEG, 100, out);
    LOGGER.debug("Saving image on the absolute path folder!");
} catch (Exception e) {
    e.printStackTrace();
} finally {
    try {
        if (out != null) {
            out.close();
        }
    } catch (IOException e) {
        e.printStackTrace();
    }
}
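If the source images are already compressed files (for example, PNGs or JPEGs downloaded from a server), streaming the raw bytes straight to disk, as in the download snippet above, avoids the Bitmap decode/re-encode entirely and is faster still.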
Related
When downloading a file from the server, if the application is killed or destroyed partway through, only part of the data will have been downloaded. How can I resume the download when the application reopens, or delete the incomplete file?
Any ideas?
private void downloadBookDetails(String pMainFolder, String pFileName, String pDownloadURL) {
    Log.i(TAG, "Coming to this downloadBookDetails ");
    try {
        URL url = new URL(pDownloadURL);
        URLConnection ucon = url.openConnection();
        ucon.setReadTimeout(5000);
        ucon.setConnectTimeout(10000);
        InputStream is = ucon.getInputStream();
        BufferedInputStream inStream = new BufferedInputStream(is, 1024 * 5);
        File directory = new File(pMainFolder, pFileName);
        FileOutputStream outStream = new FileOutputStream(directory);
        byte[] buff = new byte[5 * 1024];
        int len;
        while ((len = inStream.read(buff)) != -1) {
            outStream.write(buff, 0, len);
        }
        outStream.flush();
        outStream.close();
        inStream.close();
    } catch (Exception e) {
        // Add network error handling.
        Log.e(TAG, "Download Error Exception " + e.getMessage());
        e.printStackTrace();
    }
}
You should use DownloadManager for downloads in your app. It handles all of this for you automatically: it is a system service designed for long-running HTTP downloads.
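A minimal sketch of enqueuing a download that way (pDownloadURL and pFileName are assumed to match the parameters of the method above; adjust to your setup):

// Sketch: hand the download to the system DownloadManager service.
DownloadManager dm = (DownloadManager) context.getSystemService(Context.DOWNLOAD_SERVICE);
DownloadManager.Request request = new DownloadManager.Request(Uri.parse(pDownloadURL));
request.setTitle(pFileName);
request.setDestinationInExternalFilesDir(context, Environment.DIRECTORY_DOWNLOADS, pFileName);
request.setNotificationVisibility(DownloadManager.Request.VISIBILITY_VISIBLE);
long downloadId = dm.enqueue(request); // keep the id to query status or cancel later

DownloadManager survives process death and resumes interrupted downloads on its own, which is exactly the failure mode described in the question.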
UPDATE
If you want to download the file yourself, you can do it like this:
@SuppressLint("Wakelock")
@Override
protected String doInBackground(String... sUrl) {
    // take a CPU wake lock so the CPU keeps running if the user
    // presses the power button during the download
    PowerManager pm = (PowerManager) context.getSystemService(Context.POWER_SERVICE);
    PowerManager.WakeLock wl = pm.newWakeLock(PowerManager.PARTIAL_WAKE_LOCK,
            getClass().getName());
    wl.acquire();
    try {
        InputStream input = null;
        OutputStream output = null;
        HttpURLConnection connection = null;
        try {
            URL url = new URL(sUrl[0]);
            connection = (HttpURLConnection) url.openConnection();
            File SDCardRoot = Environment.getExternalStorageDirectory();
            File file = new File(SDCardRoot, "/" + fileName);
            int downloaded = 0;
            if (file.exists()) {
                // a partial file is already on disk: ask the server for the remaining bytes
                downloaded = (int) file.length();
                connection.setRequestProperty("Range", "bytes=" + (int) file.length() + "-");
            } else {
                file.createNewFile();
            }
            connection.setDoInput(true);
            // note: setDoOutput(true) was removed here; on HttpURLConnection it
            // silently turns the GET into a POST and breaks the download
            connection.connect();
            // might be -1 if the server did not report the length;
            // used below to display the download percentage
            int fileLength = connection.getContentLength() + (int) file.length();
            // download the file
            input = connection.getInputStream();
            if (downloaded > 0) {
                output = new FileOutputStream(file, true); // append to the partial file
            } else {
                output = new FileOutputStream(file);
            }
            byte[] data = new byte[1024];
            long total = downloaded;
            int count;
            mProgressDialog.setMax(fileLength / 1024);
            while ((count = input.read(data)) != -1) {
                // allow canceling with the back button
                if (isCancelled())
                    return null;
                total += count;
                // publish the progress, but only if the total length is known
                if (fileLength > 0)
                    publishProgress((int) total / 1024);
                output.write(data, 0, count);
            }
            output.flush();
            output.close();
            if (input != null)
                input.close();
            if (connection != null)
                connection.disconnect();
            return null;
        } catch (Exception e) {
            return e.toString();
        }
    } finally {
        // release the wake lock even if the download failed
        wl.release();
    }
}
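One caveat the snippet above does not cover: resuming only works if the server actually honors the Range header. A small sketch of the check (an assumption to verify, not guaranteed server behavior):

// Confirm the server returned 206 Partial Content before appending;
// otherwise restart the download from scratch.
int status = connection.getResponseCode();
boolean resuming = (status == HttpURLConnection.HTTP_PARTIAL); // 206
output = new FileOutputStream(file, resuming); // append only when resuming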
Question at the bottom
I'm using netty to transfer a file to another server.
I limit my file chunks to 1024*64 bytes (64 KB) because of the WebSocket protocol. The following method is a local example of what happens to the file:
public static void rechunck(File file1, File file2) {
    FileInputStream is = null;
    FileOutputStream os = null;
    try {
        byte[] buf = new byte[1024 * 64];
        is = new FileInputStream(file1);
        os = new FileOutputStream(file2);
        while (is.read(buf) > 0) {
            os.write(buf);
        }
    } catch (IOException e) {
        Controller.handleException(Thread.currentThread(), e);
    } finally {
        try {
            if (is != null && os != null) {
                is.close();
                os.close();
            }
        } catch (IOException e) {
            Controller.handleException(Thread.currentThread(), e);
        }
    }
}
The file is read by the InputStream into a byte buffer and written directly to the OutputStream.
The content of the file cannot change during this process.
To get the MD5 hash of the file I wrote the following method:
public static String checksum(File file) {
    InputStream is = null;
    try {
        is = new FileInputStream(file);
        MessageDigest digest = MessageDigest.getInstance("MD5");
        byte[] buffer = new byte[8192];
        int read = 0;
        while ((read = is.read(buffer)) > 0) {
            digest.update(buffer, 0, read);
        }
        return new BigInteger(1, digest.digest()).toString(16);
    } catch (IOException | NoSuchAlgorithmException e) {
        Controller.handleException(Thread.currentThread(), e);
    } finally {
        try {
            is.close();
        } catch (IOException e) {
            Controller.handleException(Thread.currentThread(), e);
        }
    }
    return null;
}
So, in theory, it should return the same hash, shouldn't it? The problem is that it returns two different hashes, and they don't vary between runs. As far as I can tell, the file size stays the same and so does the content.
When I run the method once with in: file-1, out: file-2, and again with in: file-2, out: file-3, the hashes of file-2 and file-3 are the same! This means the method changes the file in the same way every time.
1. 58a4a9fbe349a9e0af172f9cf3e6050a
2. 7b3f343fa1b8c4e1160add4c48322373
3. 7b3f343fa1b8c4e1160add4c48322373
Here is a little test that compares all the buffers for equality. The test passes, so there are no differences.
File file1 = new File("controller/templates/Example.zip");
File file2 = new File("controller/templates2/Example.zip");
try {
    byte[] buf1 = new byte[1024 * 64];
    byte[] buf2 = new byte[1024 * 64];
    FileInputStream is1 = new FileInputStream(file1);
    FileInputStream is2 = new FileInputStream(file2);
    boolean run = true;
    while (run) {
        int read1 = is1.read(buf1), read2 = is2.read(buf2);
        String result1 = Arrays.toString(buf1), result2 = Arrays.toString(buf2);
        boolean test = result1.equals(result2);
        System.out.println("1: " + result1);
        System.out.println("2: " + result2);
        System.out.println("--- TEST RESULT: " + test + " ----------------------------------------------------");
        if (!(read1 > 0 && read2 > 0) || !test) run = false;
    }
} catch (IOException e) {
    e.printStackTrace();
}
Question: Can you help me chunking the file without changing the hash?
while (is.read(buf) > 0) {
    os.write(buf);
}
The read() method with an array argument returns the number of bytes read from the stream. When the file length is not an exact multiple of the buffer size, the final call returns fewer bytes than the array length because you have reached the end of the file.
However, your os.write(buf); call writes the whole byte array to the stream, including the stale bytes left over after the end of the last read. That means the written file ends up bigger, and therefore the hash changes.
Interestingly, you didn't make the same mistake when you updated the message digest:
while ((read = is.read(buffer)) > 0) {
    digest.update(buffer, 0, read);
}
You just have to do the same when you "rechunk" your files.
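Applied to the copy loop, a minimal version of the fix looks like this:

int read;
while ((read = is.read(buf)) > 0) {
    os.write(buf, 0, read); // write only the bytes actually read
}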
Your rechunk method has a bug in it. Since you have a fixed buffer in there, your file is split into ByteArray-parts. but the last part of the file can be smaller than the buffer, which is why you write too many bytes in the new file. and that's why you do not have the same checksum anymore. the error can be fixed like this:
public static void rechunck(File file1, File file2) {
    FileInputStream is = null;
    FileOutputStream os = null;
    try {
        byte[] buf = new byte[1024 * 64];
        is = new FileInputStream(file1);
        os = new FileOutputStream(file2);
        int length;
        while ((length = is.read(buf)) > 0) {
            os.write(buf, 0, length);
        }
    } catch (IOException e) {
        Controller.handleException(Thread.currentThread(), e);
    } finally {
        try {
            if (is != null)
                is.close();
            if (os != null)
                os.close();
        } catch (IOException e) {
            Controller.handleException(Thread.currentThread(), e);
        }
    }
}
Thanks to the length variable, the write method knows that only the first length bytes of the array belong to the file; anything after that is leftover data from a previous read and must not be written.
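As an aside, if Java 7+ is available, java.nio.file.Files can do this copy in one call and avoids the manual buffer handling entirely (a sketch, not part of the original answer):

// Copies file1 to file2 correctly without a hand-written loop.
Files.copy(file1.toPath(), file2.toPath(), StandardCopyOption.REPLACE_EXISTING);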
I got the image from a POST response:
PostMethod post = new PostMethod(action);
HttpClient httpClient = createHttpClient();
........
httpClient.executeMethod(post);
try {
    log.info("post successfully");
    String contentType = post.getResponseHeader("Content-type").getValue();
    int contentLength = (int) post.getResponseContentLength();
    byte[] responseBody = FileUtils.convertInputStreamtoByteArray(post.getResponseBodyAsStream());
    log.info("get response successfully : size " + responseBody.length + " contentLength " + contentLength);
    return new ReturnBean(null, responseBody, contentType, contentLength);
} catch (Exception e) {
    log.error(e.getMessage());
    log.error(e.getStackTrace());
    e.printStackTrace();
    throw new ResponseFailedException(e.getMessage());
}
This is how I convert the InputStream to a byte array:
public static byte[] convertInputStreamtoByteArray(InputStream is) {
    ByteArrayOutputStream baos = new ByteArrayOutputStream();
    try {
        byte[] buf = new byte[1024];
        int i = 0;
        while ((i = is.read(buf)) >= 0) {
            baos.write(buf, 0, i);
        }
        is.close();
    } catch (Exception ex) {
        ex.printStackTrace();
    }
    return baos.toByteArray();
}
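As an aside, Commons IO (the same library that provides the copyLarge used below) has an equivalent one-liner that may be worth trying instead of the hand-rolled method:

// Reads the whole stream into a byte array; does not close the stream.
byte[] responseBody = IOUtils.toByteArray(is);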
This is how I return the image as a response:
byte[] imageSource = (byte[])returnStream.getBean();
log.info("imageSource " + imageSource.length);
getResponse().setContentType((String) returnStream.getBean2());
getResponse().setContentLength((Integer) returnStream.getBean3());
getResponse().getOutputStream().write(imageSource);
getResponse().getOutputStream().flush();
I was able to print out the image, but the bottom part of it is missing. I checked the size of the byte array I got, and it is equal to the size of the actual image.
When I used IOUtils.copyLarge() instead of my convertInputStreamtoByteArray method:
ServletOutputStream outputStream = getResponse().getOutputStream();
InputStream inputStream = (InputStream) returnStream.getBean();
IOUtils.copyLarge(inputStream , outputStream);
it works. I don't know what happened, because when I tried it a while ago it didn't work.
I am fetching images from Facebook and writing them to SD card, but the image quality is very low. Following is my code to fetch and write:
try {
    URL url = new URL(murl);
    HttpURLConnection connection = (HttpURLConnection) url.openConnection();
    connection.setDoInput(true);
    connection.connect();
    InputStream input = connection.getInputStream();
    Bitmap myBitmap = BitmapFactory.decodeStream(input);
    data1 = String.valueOf(String.format(getActivity().getApplicationContext().getFilesDir() + "/Rem/%d.jpg", System.currentTimeMillis()));
    FileOutputStream stream = new FileOutputStream(data1);
    ByteArrayOutputStream outstream = new ByteArrayOutputStream();
    myBitmap.compress(Bitmap.CompressFormat.JPEG, 100, outstream);
    byte[] byteArray = outstream.toByteArray();
    stream.write(byteArray);
    stream.close();
} catch (Exception e) {
    e.printStackTrace();
}
The following code I use to display the same image:
File IMG_FILE = new File(IMAGE_CONTENT);
B2.setVisibility(View.INVISIBLE);
Options options = new BitmapFactory.Options();
options.inScaled = false;
options.inDither = false;
options.inPreferredConfig = Bitmap.Config.ARGB_8888;
Bitmap bitmap = BitmapFactory.decodeFile(IMG_FILE.getAbsolutePath(),options);
iM.setImageBitmap(bitmap);
The quality is still low even after using Options. What can be done to improve this?
To save an image from a URL onto the SD card, use this code:
try {
    URL url = new URL("Enter the URL to be downloaded");
    HttpURLConnection urlConnection = (HttpURLConnection) url.openConnection();
    urlConnection.setRequestMethod("GET");
    // note: do not call setDoOutput(true) for a GET; it would turn the request into a POST
    urlConnection.connect();
    File SDCardRoot = Environment.getExternalStorageDirectory().getAbsoluteFile();
    String filename = "downloadedFile.png";
    Log.i("Local filename:", "" + filename);
    File file = new File(SDCardRoot, filename);
    file.createNewFile(); // no-op if the file already exists
    FileOutputStream fileOutput = new FileOutputStream(file);
    InputStream inputStream = urlConnection.getInputStream();
    int totalSize = urlConnection.getContentLength();
    int downloadedSize = 0;
    byte[] buffer = new byte[1024];
    int bufferLength = 0;
    while ((bufferLength = inputStream.read(buffer)) > 0) {
        fileOutput.write(buffer, 0, bufferLength);
        downloadedSize += bufferLength;
        Log.i("Progress:", "downloadedSize:" + downloadedSize + " totalSize:" + totalSize);
    }
    fileOutput.close();
    if (downloadedSize == totalSize) filepath = file.getPath();
} catch (MalformedURLException e) {
    e.printStackTrace();
} catch (IOException e) {
    filepath = null;
    e.printStackTrace();
}
Log.i("filepath:", " " + filepath);
return filepath;
Use this code to set an SD card image as your ImageView background:
File f = new File("/mnt/sdcard/photo.jpg");
ImageView imgView = (ImageView)findViewById(R.id.imageView);
Bitmap bmp = BitmapFactory.decodeFile(f.getAbsolutePath());
imgView.setImageBitmap(bmp);
Or use this:
File file = ....
Uri uri = Uri.fromFile(file);
imgView.setImageURI(uri);
You can also display an image from the web directly, without downloading it first. Check the function below; it loads an image from the web for use in your ImageView.
public static Drawable LoadImageFromWebOperations(String url) {
    try {
        InputStream is = (InputStream) new URL(url).getContent();
        Drawable d = Drawable.createFromStream(is, "src name");
        return d;
    } catch (Exception e) {
        return null;
    }
}
Then set the image on the ImageView using this code in your activity.
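Note that on Android 3.0 and later, network access on the main thread throws NetworkOnMainThreadException, so a method like this must be called from a background thread (for example, from an AsyncTask).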
The issue is that you're dealing with a lossy format (JPG) and are re-compressing the image. Even with quality at 100 you still get loss - you just get the least amount.
Rather than decompressing to a Bitmap then re-compressing when you write it to the file, you want to download the raw bytes directly to a file.
...
InputStream is = connection.getInputStream();
OutputStream os = new FileOutputStream(data1);
byte[] b = new byte[2048];
int length;
while ((length = is.read(b)) != -1) {
    os.write(b, 0, length);
}
is.close();
os.close();
...
I have some code that zips up a file, sends it over the network, and then unzips it on the other end. I'm still testing the code, so the source and the destination are the same machine. Zipping the file takes on the order of a minute; unzipping it takes on the order of an hour. I think there must be a flaw in my code to cause such a large difference. Here's the code to unzip:
public String uncompressLocalZip(String filename, String strUUID, ParentEntry pe, boolean bControlFileProgress) {
    final int BUFFER = 2048;
    BufferedOutputStream out = null;
    ZipInputStream zis = null;
    try {
        FileInputStream fis = new FileInputStream(Constants.conf.getFileDirectory() + Constants.PATH_SEPARATOR + strUUID + Constants.PATH_SEPARATOR + filename);
        zis = new ZipInputStream(new BufferedInputStream(fis));
        ZipEntry entry;
        long totallength = 0;
        long size = 0;
        if (pe != null)
            size = pe.getSize();
        while ((entry = zis.getNextEntry()) != null) {
            System.out.println("Extracting: " + entry);
            int count;
            byte[] data = new byte[BUFFER];
            // write the files to the disk
            File fileOutput = new File(Constants.conf.getFileDirectory() + Constants.PATH_SEPARATOR + strUUID + Constants.PATH_SEPARATOR + Constants.conf.getUncompressFolderName() + Constants.PATH_SEPARATOR + entry.getName());
            new File(fileOutput.getParent()).mkdirs();
            BufferedOutputStream fos = new BufferedOutputStream(new FileOutputStream(fileOutput));
            out = new BufferedOutputStream(fos, BUFFER);
            while ((count = zis.read(data, 0, BUFFER)) != -1) {
                out.write(data, 0, count);
                totallength += count;
            }
            out.flush();
        }
    } catch (Exception e) {
        e.printStackTrace();
        return ("FAILED");
    } finally {
        try { if (out != null) out.close(); } catch (IOException ioe) {}
        try { if (zis != null) zis.close(); } catch (IOException ioe) {}
    }
    return ("SUCCESS");
}
Here's the code to zip:
public void createLocalZip(String filename, ProcessEntry pe) {
    ZipOutputStream out = null;
    try {
        File fileOutput = new File(filename);
        out = new ZipOutputStream(new BufferedOutputStream(new FileOutputStream(fileOutput)));
        long totallength = 0;
        long size = pe.getParentEntry().getSize();
        String strStartDirectory;
        if (pe.getParentEntry().isDirectory())
            strStartDirectory = pe.getParentEntry().getUrl();
        else
            strStartDirectory = pe.getParentEntry().getFolder();
        for (int i = 0; i < pe.getParentEntry().tableModel3.getRowCount(); i++) {
            FileEntry fe = pe.getParentEntry().tableModel3.getFileEntry(i);
            File fileInput = new File(fe.getUrl());
            FileInputStream input = new FileInputStream(fileInput);
            BufferedInputStream in = new BufferedInputStream(input);
            String strRelativeDir = fe.getUrl().substring(strStartDirectory.length() + 1, fe.getUrl().length());
            ZipEntry entry = new ZipEntry(strRelativeDir);
            out.putNextEntry(entry);
            byte[] bbuf = new byte[2048];
            int length = 0;
            while ((in != null) && ((length = in.read(bbuf)) != -1)) {
                out.write(bbuf, 0, length);
                totallength += length;
                pe.setProgress((int) (totallength * 100 / size));
            }
            in.close();
        }
    } catch (Exception e) {
        System.out.println(e.getMessage());
    } finally {
        try { if (out != null) out.close(); } catch (IOException ioe) {}
    }
}
Update: the compression ratio for this particular test is about 90% (1.2 GB down to about 100 MB). So I suppose it could be the extra disk writing for unzipping vs. zipping, although that would suggest something closer to a 10X difference, not 60X.
Don't double-wrap your OutputStream in BufferedOutputStream (you only need one BufferedOutputStream wrapper), and close each one after you are done writing to it.
Also, ZipEntry objects can be directories, so check for that and handle it accordingly.
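A sketch of the extraction loop with those fixes applied (try-with-resources assumes Java 7+; destDir stands in for your output directory):

ZipEntry entry;
byte[] data = new byte[8192];
while ((entry = zis.getNextEntry()) != null) {
    File outFile = new File(destDir, entry.getName());
    if (entry.isDirectory()) {
        outFile.mkdirs(); // directory entry: just create it, nothing to write
        continue;
    }
    outFile.getParentFile().mkdirs();
    // a single BufferedOutputStream wrapper, closed per entry
    try (BufferedOutputStream out = new BufferedOutputStream(new FileOutputStream(outFile))) {
        int count;
        while ((count = zis.read(data)) != -1) {
            out.write(data, 0, count);
        }
    }
}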
I have no really big file to test your code with, so I can only guess.
You say your uncompressed zip size is more than 1 GB. That could be more than fits in your memory, and if something forces the VM to keep everything in memory, it will have to swap. Observe your program with a profiler.
Make sure you close each FileOutputStream after writing to it. (You create lots of them, and only close the last one.)
I'm not sure about the ZipInputStream implementation (maybe it forces your BufferedInputStream to buffer much of the data). You could try ZipFile instead, which basically allows random access.
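A sketch of reading entries through ZipFile instead (zipPath is a placeholder for the archive's path; try-with-resources assumes Java 7+):

try (ZipFile zf = new ZipFile(zipPath)) {
    Enumeration<? extends ZipEntry> entries = zf.entries();
    while (entries.hasMoreElements()) {
        ZipEntry entry = entries.nextElement();
        try (InputStream in = zf.getInputStream(entry)) {
            // copy 'in' to disk with the usual read/write loop
        }
    }
}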
Consider using a specialized library to do the zipping/unzipping. http://sevenzipjbind.sourceforge.net/ might help.