Android: copying large InputStream to file very slow - Java

I have an app that downloads a zip file and then copies it to a temporary file on the phone's SD card, but the copy is very, very slow.
InputStream in = new BufferedInputStream(url.openStream(), 1024);
File tempFile = File.createTempFile("arc", ".zip", targetDir); // targetDir is a File pointing at the destination directory
String tempFilePath = tempFile.getAbsolutePath();
OutputStream out = new BufferedOutputStream(new FileOutputStream(tempFile));
//copying file (in different void)
byte[] buffer = new byte[8192];
int len;
len = in.read(buffer);
//it loops here for AGES
while (len >= 0) {
out.write(buffer, 0, len);
len = in.read(buffer);
}
in.close();
out.close();
My file is about 20 MB. Initially I had a buffer size of 1024 and changed it to 8192 thinking it might speed things up, but it seemed to make no difference. It always finishes and I get no errors, it just takes ages!
I have searched for a solution but I'm not coming up with anything, so I may be going about this totally the wrong way.
Can anyone see what I'm doing wrong?
Bex

Do not increase the buffer size further; a very large buffer may cause an OutOfMemoryError in your application.
There are various factors that can make your download slow: a weak internet connection, the transfer and receive mode, and the capacity of the device. Check whether you are using the following code to create the InputStream:
URL u = new URL("enter url here");
HttpURLConnection c = (HttpURLConnection) u.openConnection();
c.setRequestMethod("GET");
// c.setDoOutput(true); // not needed for a GET; enabling output would switch the request to POST
c.connect();
InputStream in = c.getInputStream();
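For reference, a minimal end-to-end sketch of the whole download (a sketch only: the URL and timeout values are placeholders, and targetDir is the directory from the question). With an 8 KB buffer the copy loop itself is rarely the bottleneck; on a device the network and the SD card's write speed usually dominate.
URL url = new URL("http://example.com/archive.zip"); // placeholder URL
HttpURLConnection c = (HttpURLConnection) url.openConnection();
c.setRequestMethod("GET");
c.setConnectTimeout(15000); // placeholder timeouts, so a dead connection fails fast
c.setReadTimeout(15000);
c.connect();
File tempFile = File.createTempFile("arc", ".zip", targetDir);
InputStream in = new BufferedInputStream(c.getInputStream(), 8192);
OutputStream out = new BufferedOutputStream(new FileOutputStream(tempFile), 8192);
try {
    byte[] buffer = new byte[8192];
    int len;
    while ((len = in.read(buffer)) != -1) {
        out.write(buffer, 0, len);
    }
    out.flush();
} finally {
    out.close();
    in.close();
    c.disconnect();
}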
Thanks
Deepak

Related

How does InputStream really work while reading a file from a socket in Java?

I have a simple program which gets a BufferedInputStream from a URL, and I have seen that while reading from the underlying stream, the read(bytes) calls go from BufferedInputStream into FileInputStream (I convinced myself that, since at the other end of the socket it is actually a file, maybe that is why it goes to FileInputStream; please let me know if my assumption is correct).
When the read happens in FileInputStream's read() method, the "path" variable is set to the location of the class file from which the read call is being invoked. This is very confusing to me, as I was expecting the actual URL of the file I am downloading.
Please help me understand this, and how read() actually works on a remote file.
URL url = new URL("some url for downloading a file");
BufferedInputStream bis = new BufferedInputStream(url.openStream());
FileOutputStream fos = new FileOutputStream(file);
int size = 65536;
byte[] buffer = new byte[size];
int count;
while ((count = bis.read(buffer, 0, size)) != -1) {
fos.write(buffer, 0, count);
}
fos.close();
bis.close();
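If it helps, the FileInputStream reads you see in the debugger are most likely the class loader pulling your local .class files off disk, which would explain why "path" points at your class file; the remote bytes themselves arrive through the HTTP protocol handler's socket-backed stream. A quick way to check what openStream() actually returns (a sketch; the printed class name is JDK-specific and the URL is a placeholder):
import java.io.InputStream;
import java.net.URL;

public class StreamClassProbe {
    public static void main(String[] args) throws Exception {
        URL url = new URL("https://example.com/some-file.zip"); // placeholder URL
        InputStream raw = url.openStream();
        // Prints the protocol handler's stream class; on OpenJDK this is typically
        // something like sun.net.www.protocol.http.HttpURLConnection$HttpInputStream,
        // not a FileInputStream.
        System.out.println(raw.getClass().getName());
        raw.close();
    }
}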

FTP CSV file download with Java

I successfully download a CSV file stored on an FTP server using the code below.
URL url = new URL(ftpUrl);
URLConnection conn = url.openConnection();
InputStream inputStream = conn.getInputStream();
long filesize = conn.getContentLength();
byte[] buffer = new byte[4096];
int bytesRead = -1;
while ((bytesRead = inputStream.read(buffer)) != -1) {
String str = new String(buffer, "UTF-8");
out.println(str);
}
The CSV file is about 1,276 KB and has about 20,000 rows. The problem is that the generated CSV file has some lines that are either blank or missing information. The damaged rows occur roughly every 100 records. I tried to fix it by increasing the buffer size, but damaged rows still exist.
Any help is going to be appreciated :)
Sure! You have to tell the String constructor how many bytes were actually read: unless a read happens to fill the whole 4096-byte buffer, the tail of the buffer still holds leftover bytes from the previous read.
You must use this syntax instead:
String str = new String(buffer, 0, bytesRead, "utf-8");
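For completeness, the corrected loop would look something like this, using the variables from the question's snippet. Note also that println adds a line break after every chunk, which by itself can split rows, so print is used here. Even then, a multi-byte UTF-8 character can straddle two reads; wrapping the stream in an InputStreamReader/BufferedReader is the safer way to copy text.
while ((bytesRead = inputStream.read(buffer)) != -1) {
    // Convert only the bytes actually read, and avoid injecting extra newlines.
    out.print(new String(buffer, 0, bytesRead, "UTF-8"));
}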

Stream file from URL to File without storing it in memory

I want to download a file from a URL and store it in the file system. However, I have memory limitations and I don't want to buffer the whole file in memory first. I am not a Java expert and I am a bit lost among all the classes: InputStream, BufferedReader, FileOutputStream, etc. Could you help me, please?
For now I have:
URLConnection ucon = url.openConnection();
ucon.connect();
InputStream is = ucon.getInputStream();
// Create a reader for the input stream.
BufferedReader br = new BufferedReader(new InputStreamReader(is));
// ?
FileOutputStream fos = context.openFileOutput(FILENAME, Context.MODE_PRIVATE);
// Here the content can be too big for the memory...
fos.write(content.getBytes());
fos.close();
Please, could you give me some clue? I was also thinking of reading it chunk by chunk, but I am not sure what would be easiest in Java...
You can use Apache Commons IO:
org.apache.commons.io.FileUtils.copyURLToFile(URL, File)
I guess it may not work on Android, though.
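If it is available, a minimal usage sketch (assumes commons-io on the classpath; the URL and file name are placeholders):
import java.io.File;
import java.net.URL;
import org.apache.commons.io.FileUtils;

public class CopyUrlToFileExample {
    public static void main(String[] args) throws Exception {
        // Streams the response straight to disk rather than holding it all in memory.
        FileUtils.copyURLToFile(new URL("https://example.com/big-file.zip"), // placeholder URL
                new File("big-file.zip"));
    }
}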
I use this code
InputStream input = connection.getInputStream();
byte[] buffer = new byte[4096];
int cnt;
OutputStream output = new FileOutputStream(file);
while ((cnt = input.read(buffer)) != -1) {
output.write(buffer, 0, cnt);
}
output.close();
input.close();
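On Java 7+ (Android API level 19+), a try-with-resources version of the same loop closes both streams even if the copy throws, using connection and file from the snippet above; a sketch:
try (InputStream input = connection.getInputStream();
        OutputStream output = new FileOutputStream(file)) {
    byte[] buffer = new byte[4096];
    int cnt;
    while ((cnt = input.read(buffer)) != -1) {
        output.write(buffer, 0, cnt);
    }
}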

Need help optimising BufferedReader output

I am sending a file to the browser in a servlet. The highest JDK I can use is 1.4.2, and I also have to retrieve the file via a URL. I am also trying to use "guessContentTypeFromStream", but I keep getting null, which raises an exception when used in the code sample below. I currently have to hard-code or work out the content type myself.
What I would like to know is how I can refactor this code so the file transmission is as fast as possible, and also use guessContentTypeFromStream. (Note: "res" is an HttpServletResponse.)
URL servletUrl = new URL(sFileURL);
URLConnection conn = servletUrl.openConnection();
int read;
BufferedInputStream bis = new BufferedInputStream(conn.getInputStream());
String sContentType =conn.guessContentTypeFromStream(conn.getInputStream());
res.setContentType(sContentType);
//res.setContentType("image/jpeg");
PrintWriter os = res.getWriter();
while((read = bis.read()) != -1){
os.write(read);
}
//Clean resources
os.flush();
This is how you normally read/write data.
in = new BufferedInputStream(socket.getInputStream(), BUFFER_SIZE);
byte[] dataBuffer = new byte[1024 * 16];
int size = 0;
while ((size = in.read(dataBuffer)) != -1) {
out.write(dataBuffer, 0, size);
}
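Applied to the servlet question above, a JDK 1.4-compatible sketch: guessContentTypeFromStream is a static method on URLConnection and needs a stream that supports mark/reset (a BufferedInputStream does), and binary data should go through the response's OutputStream rather than a PrintWriter. The octet-stream fallback is an assumption, and conn/res are the variables from the question.
BufferedInputStream bis = new BufferedInputStream(conn.getInputStream());
String contentType = URLConnection.guessContentTypeFromStream(bis); // peeks via mark/reset; may still return null
if (contentType == null) {
    contentType = "application/octet-stream"; // assumed fallback
}
res.setContentType(contentType);
OutputStream os = res.getOutputStream(); // raw bytes, not a PrintWriter
byte[] buf = new byte[8192];
int n;
while ((n = bis.read(buf)) != -1) {
    os.write(buf, 0, n);
}
os.flush();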

URL Connection (FTP) in Java - Simple Question

I have a simple question. I'm trying to upload a file to my ftp server in Java.
I have a file on my computer, and I want to make a copy of that file and upload it. I tried manually writing each byte of the file to the output stream, but that doesn't work for complicated files, like zip files or pdf files.
File file = some file on my computer;
String name = file.getName();
URL url = new URL("ftp://user:password#domain.com/" + name +";type=i");
URLConnection urlc = url.openConnection();
OutputStream os = urlc.getOutputStream();
//then what do I do?
Just for kicks, here is what I tried to do:
OutputStream os = urlc.getOutputStream();
BufferedReader br = new BufferedReader(new FileReader(file));
String line = br.readLine();
while(line != null && (!line.equals(""))) {
os.write(line.getBytes());
os.write("\n".getBytes());
line = br.readLine();
}
os.close();
For example, when I do this with a pdf and then try and open the pdf that I run with this program, it says an error occurred when trying to open the pdf. I'm guessing because I am writing a "\n" to the file? How do I copy the file without doing this?
Do not use any of the Reader or Writer classes when you're trying to copy the byte-for-byte exact contents of a binary file. Use these only for plain text! Instead, use the InputStream and OutputStream classes; they do not interpret the data at all, while the Reader and Writer classes interpret the data as characters. For example
OutputStream os = urlc.getOutputStream();
FileInputStream fis = new FileInputStream(file);
byte[] buffer = new byte[1000];
int count = 0;
while((count = fis.read(buffer)) > 0) {
os.write(buffer, 0, count);
}
Whether your URLConnection usage is correct here, I don't know; using Apache Commons FTP (as suggested elsewhere) would be an excellent idea. Regardless, this would be the way to read the file.
Use a BufferedInputStream to read and BufferedOutputStream to write. Take a look at this post: http://www.ajaxapp.com/2009/02/21/a-simple-java-ftp-connection-file-download-and-upload/
InputStream is = new FileInputStream(localfilename);
BufferedInputStream bis = new BufferedInputStream(is);
OutputStream os = m_client.getOutputStream();
BufferedOutputStream bos = new BufferedOutputStream(os);
byte[] buffer = new byte[1024];
int readCount;
while( (readCount = bis.read(buffer)) > 0) {
bos.write(buffer, 0, readCount);
}
bos.close();
FTP usually opens another connection for data transfer, so I am not convinced that this approach with URLConnection is going to work.
I highly recommend that you use a specialized FTP client; Apache Commons may have one.
Check this out:
http://commons.apache.org/net/api/org/apache/commons/net/ftp/FTPClient.html
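A minimal upload sketch with Commons Net's FTPClient (assumes commons-net on the classpath; the host, credentials, and file name are placeholders):
import java.io.FileInputStream;
import java.io.InputStream;
import org.apache.commons.net.ftp.FTP;
import org.apache.commons.net.ftp.FTPClient;

public class FtpUploadExample {
    public static void main(String[] args) throws Exception {
        FTPClient ftp = new FTPClient();
        ftp.connect("ftp.example.com");        // placeholder host
        ftp.login("user", "password");         // placeholder credentials
        ftp.enterLocalPassiveMode();           // often needed behind NAT/firewalls
        ftp.setFileType(FTP.BINARY_FILE_TYPE); // binary mode so PDFs/ZIPs are not mangled
        InputStream in = new FileInputStream("somefile.pdf"); // placeholder local file
        try {
            ftp.storeFile("somefile.pdf", in); // remote name, local stream
        } finally {
            in.close();
            ftp.logout();
            ftp.disconnect();
        }
    }
}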
