How does increasing the buffer size affect the download speed? - java

I have the following code to download from the internet
URLConnection ucon = url.openConnection();
InputStream is = ucon.getInputStream();
BufferedInputStream inStream = new BufferedInputStream(is, 8192);//this is value one
FileOutputStream outStream = new FileOutputStream(new File(location));
byte[] buff = new byte[8192];//this is value two
int len;
while ((len = inStream.read(buff)) != -1) {
outStream.write(buff, 0, len);
}
outStream.close();
inStream.close();
My question is: what happens if I change the two buffer values, value one and value two?
How does the download speed change? Should they be the same value? If they are not the same, what should each of them indicate?
I have read the following post:
How do you determine the ideal buffer size when using FileInputStream?
but I did not really understand it. Can someone explain it to me, please?
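To make the distinction concrete: value one is the size of BufferedInputStream's internal prefetch buffer, and value two is the size of the chunk your own loop copies per read() call; the two are independent. A minimal sketch (reading from an in-memory stream here instead of a real connection; the sizes are only illustrative):

```java
import java.io.*;

public class BufferSizesDemo {
    public static void main(String[] args) throws IOException {
        byte[] data = new byte[10000]; // stand-in for the downloaded content
        // "value one": BufferedInputStream's internal buffer, filled per underlying read
        BufferedInputStream in = new BufferedInputStream(new ByteArrayInputStream(data), 8192);
        // "value two": the chunk your copy loop handles per read() call
        byte[] buff = new byte[512];
        int len = in.read(buff);
        // a single read() returns at most buff.length bytes, regardless of value one
        System.out.println(len); // 512
    }
}
```

They need not be equal: value one mainly reduces the number of calls into the underlying stream, while value two only sets how much data each iteration of your loop moves.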

Related

ftp csv file download with java

I successfully download a csv file stored on an ftp server using the code below.
URL url = new URL(ftpUrl);
URLConnection conn = url.openConnection();
InputStream inputStream = conn.getInputStream();
long filesize = conn.getContentLength();
byte[] buffer = new byte[4096];
int bytesRead = -1;
while ((bytesRead = inputStream.read(buffer)) != -1) {
String str = new String(buffer, "UTF-8");
out.println(str);
}
The csv file is 1276 KB and has about 20,000 rows. The problem is that the generated csv file has some lines that are either blank or missing information. The damaged rows occur about every 100 records. I tried to fix it by increasing the buffer size, but damaged rows still exist.
Any help is going to be appreciated :)
Sure! You have to tell the String constructor how many bytes were actually read: unless every read fills the buffer exactly (e.g. when the file size is a multiple of 4096), the end of the buffer will still hold stale bytes from the previous read.
You must use this constructor instead:
String str = new String(buffer, 0, bytesRead, "utf-8");
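A minimal, self-contained sketch of the corrected loop (reading from an in-memory stream here instead of the FTP connection, with a deliberately tiny buffer to force several partial reads). One caveat: decoding each chunk separately can still split a multi-byte UTF-8 character at a buffer boundary; for pure-ASCII csv data that is not an issue.

```java
import java.io.*;
import java.nio.charset.StandardCharsets;

public class ReadLoopDemo {
    // Append only the bytes actually returned by each read() call.
    public static String readAll(InputStream in) throws IOException {
        byte[] buffer = new byte[4]; // tiny on purpose, to force partial reads
        StringBuilder sb = new StringBuilder();
        int bytesRead;
        while ((bytesRead = in.read(buffer)) != -1) {
            sb.append(new String(buffer, 0, bytesRead, StandardCharsets.UTF_8));
        }
        return sb.toString();
    }

    public static void main(String[] args) throws IOException {
        InputStream in = new ByteArrayInputStream("id,name\n1,abc\n".getBytes(StandardCharsets.UTF_8));
        System.out.print(readAll(in)); // prints the csv content with no stale trailing bytes
    }
}
```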

Android ByteArrayBuffer holds more bytes than capacity

I found the following code in a project. It works and can read files bigger than 20 MB. From how the code is set up, it should fail after 5000 bytes. Why does it work? The docs for ByteArrayBuffer indicate nothing like that. I have verified that the read loop iterates over all bytes of each file.
http://developer.android.com/reference/org/apache/http/util/ByteArrayBuffer.html
URLConnection ucon = url.openConnection();
ucon.setReadTimeout(10000); // enables throw SocketTimeoutException
InputStream is = ucon.getInputStream();
long expectedFileSize = ucon.getContentLength();
Log.d("Downloader", "expected file size: " + expectedFileSize);
BufferedInputStream bis = new BufferedInputStream(is);
// Read bytes to the Buffer until there is nothing more to read(-1).
// This code can read 20MB and bigger mp3-files, ... why?!?
ByteArrayBuffer baf = new ByteArrayBuffer(5000);
int current = 0;
while ((current = bis.read()) != -1) {
baf.append((byte) current);
}
FileOutputStream fos = new FileOutputStream(file);
fos.write(baf.toByteArray());
fos.flush();
fos.close();
5000 is just an initial capacity. The buffer gets resized automatically once it reaches its limit.
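The standard-library ByteArrayOutputStream behaves the same way and illustrates the point (a sketch using the JDK class rather than the Apache one):

```java
import java.io.ByteArrayOutputStream;

public class GrowDemo {
    public static void main(String[] args) {
        // Initial capacity of only 10 bytes, but writes far beyond it succeed:
        // the internal array is reallocated and grown automatically as needed.
        ByteArrayOutputStream baos = new ByteArrayOutputStream(10);
        for (int i = 0; i < 5000; i++) {
            baos.write(i & 0xFF);
        }
        System.out.println(baos.size()); // 5000
    }
}
```

The capacity argument is a performance hint (it reduces reallocations), not a hard limit.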

Stream file from URL to File without storing it in the memory

I want to download a file from a URL and store it in the file system. However, I have memory limitations and I don't want to hold it in memory first. I am not a Java expert and I am a bit lost among all the classes: InputStream, BufferedReader, FileOutputStream, etc. Could you help me, please?
For now I have:
URLConnection ucon = url.openConnection();
ucon.connect();
InputStream is = ucon.getInputStream();
// Create a reader for the input stream.
BufferedReader br = new BufferedReader(isr);
// ?
FileOutputStream fos = context.openFileOutput(FILENAME, Context.MODE_PRIVATE);
// Here the content can be too big for the memory...
fos.write(content.getBytes());
fos.close();
Please, could you give me some clue ? I was thinking also to read it chunk by chunk, but I am not sure what would be the easiest with java...
You can use Apache Commons IO:
org.apache.commons.io.FileUtils.copyURLToFile(URL, File)
I guess it may not work on Android, though.
I use this code
InputStream input = connection.getInputStream();
OutputStream output = new FileOutputStream(file);
byte[] buffer = new byte[4096];
int cnt;
while ((cnt = input.read(buffer)) != -1)
{
output.write(buffer, 0, cnt);
}
output.close();
input.close();
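For reference, the same chunked copy can be factored into a small helper; only one buffer's worth of data is ever held in memory, regardless of file size. A sketch (on Java 7+, try-with-resources closes both streams even on error; the in-memory streams in main stand in for the URL connection and the output file):

```java
import java.io.*;

public class StreamCopy {
    // Copy in fixed-size chunks; returns the number of bytes copied.
    public static long copy(InputStream in, OutputStream out) throws IOException {
        byte[] buffer = new byte[4096];
        long total = 0;
        int count;
        while ((count = in.read(buffer)) != -1) {
            out.write(buffer, 0, count); // only 'count' bytes are valid this pass
            total += count;
        }
        return total;
    }

    public static void main(String[] args) throws IOException {
        byte[] data = new byte[10000]; // stand-in for the downloaded content
        try (InputStream in = new ByteArrayInputStream(data);
             OutputStream out = new ByteArrayOutputStream()) {
            System.out.println(copy(in, out)); // 10000
        }
    }
}
```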

Need help optimising bufferedReader output

I am sending a file to the browser in a servlet. The highest JDK I can use is 1.4.2, and I also have to retrieve the file via a URL. I am trying to use "guessContentTypeFromStream", but I keep getting null, which raises an exception when used in the code sample below. I currently have to hard-code or work out the content type myself.
What I would like to know is how I can refactor this code so the file transmission is as fast as possible, and how I can also use guessContentTypeFromStream. (Note: "res" is the HttpServletResponse.)
URL servletUrl = new URL(sFileURL);
URLConnection conn = servletUrl.openConnection();
int read;
BufferedInputStream bis = new BufferedInputStream(conn.getInputStream());
String sContentType = conn.guessContentTypeFromStream(conn.getInputStream());
res.setContentType(sContentType);
//res.setContentType("image/jpeg");
PrintWriter os = res.getWriter();
while((read = bis.read()) != -1){
os.write(read);
}
//Clean resources
os.flush();
This is how you normally read/write data:
in = new BufferedInputStream(socket.getInputStream(), BUFFER_SIZE);
byte[] dataBuffer = new byte[1024 * 16];
int size = 0;
while ((size = in.read(dataBuffer)) != -1) {
out.write(dataBuffer, 0, size);
}
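On the guessContentTypeFromStream part: it is a static method of URLConnection, it needs a stream that supports mark/reset (a BufferedInputStream does; the raw connection stream may not), and it returns null for content it does not recognize. So wrap the connection's stream in a BufferedInputStream once and use that same stream for both the guess and the copy, instead of calling getInputStream() twice. A minimal sketch using an in-memory GIF header:

```java
import java.io.*;
import java.net.URLConnection;

public class ContentTypeDemo {
    public static void main(String[] args) throws IOException {
        // guessContentTypeFromStream is *static* on URLConnection and requires
        // a stream supporting mark/reset, e.g. a BufferedInputStream.
        byte[] gifHeader = "GIF87a".getBytes("US-ASCII");
        InputStream in = new BufferedInputStream(new ByteArrayInputStream(gifHeader));
        System.out.println(URLConnection.guessContentTypeFromStream(in)); // image/gif
    }
}
```

Separately, binary file data should go through res.getOutputStream() rather than the PrintWriter from res.getWriter(); the writer applies character encoding and can corrupt binary content.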

Android copying large inputstream to file very slow

I have an app that downloads a zip file and then copies it to a temporary file on the phone's SD card, but it is very, very slow.
InputStream in = new BufferedInputStream(url.openStream(), 1024);
File tempFile = File.createTempFile("arc", ".zip", targetDir); //target dir is a file
String tempFilePath = tempFile.getAbsolutePath();
OutputStream out = new BufferedOutputStream(new FileOutputStream(tempFile));
//copying file (in different void)
byte[] buffer = new byte[8192];
int len;
len = in.read(buffer);
//it loops here for AGES
while (len >= 0) {
out.write(buffer, 0, len);
len = in.read(buffer);
}
in.close();
out.close();
My file is about 20 MB. Initially I had a buffer size of 1024 and changed it to 8192 thinking it might speed things up, but it seemed to make no difference. It always finishes and I get no errors; it just takes ages!
I have searched to try and find a solution but I'm not coming up with anything so I may be going about this totally the wrong way?
Can anyone see what I'm doing wrong?
Bex
Do not increase the buffer size too far; that may cause an OutOfMemoryError in your application.
There are various factors that can make your download slow: a weak internet connection, and the transfer and receiving mode also matter. It also depends on the capacity of the device. Check whether you are using the following code to create the input stream:
URL u = new URL("enter url here");
HttpURLConnection c = (HttpURLConnection) u.openConnection();
c.setRequestMethod("GET");
// note: don't call setDoOutput(true) for a GET; it turns the request into a POST
c.connect();
InputStream in = c.getInputStream();
Thanks
Deepak
