I'm writing a simple download manager and I'm trying to add resume support for all downloads. After googling how to do that, I know I must call setRequestProperty on the connection, but my code does not work and I get this error:
FATAL EXCEPTION: Thread-882
java.lang.IllegalStateException: Cannot set request property after connection is made
at libcore.net.http.HttpURLConnectionImpl.setRequestProperty(HttpURLConnectionImpl.java:510)
My code is:
URL url = new URL(downloadPath);
HttpURLConnection connection = (HttpURLConnection) url.openConnection();
final int fileSize = connection.getContentLength();
File file = new File(filepath);
if (file.exists() && fileSize == file.length()) {
    return;
} else if (file.exists()) {
    connection.setRequestProperty("Range", "bytes=" + file.length() + "-");
} else {
    connection.setRequestProperty("Range", "bytes=" + downloadedSize + "-");
}
connection.setRequestMethod("GET");
connection.setDoInput(true);
connection.setDoOutput(true);
connection.connect();
How can I resolve this problem and correctly call setRequestProperty on the connection?
The problem is that you're calling connection.getContentLength() before you're calling setRequestProperty(). The content length is only available after you've made a request, at which point you can't set the request property...
It's not entirely clear what you're trying to do, but one option is to use a HEAD request just to get the content length, and then make a separate request if you need to get just a portion of the data. Be aware that it's possible that the content length will change between requests, of course.
However, I would actually suggest keeping more metadata somewhere in your download manager - so that when you first start downloading the data, you keep a record of the total size, so that you don't need to make the HEAD request when resuming - you can tell just from the local information whether or not you've already downloaded a file. (This has the same problem in terms of content changing, but that's a different matter.)
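A minimal sketch of the HEAD-then-ranged-GET idea (not taken from the answer above), reusing the question's downloadPath and filepath:

// HEAD request: headers only, used to learn the total size.
HttpURLConnection head = (HttpURLConnection) new URL(downloadPath).openConnection();
head.setRequestMethod("HEAD");
long totalSize = head.getContentLength();
head.disconnect();

File file = new File(filepath);
if (!file.exists() || file.length() < totalSize) {
    // Separate GET request for just the missing portion; set the header before connecting.
    HttpURLConnection get = (HttpURLConnection) new URL(downloadPath).openConnection();
    get.setRequestProperty("Range", "bytes=" + file.length() + "-");
    get.connect();
    // ... read get.getInputStream() and append the bytes to the file
}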
I had the same error as the OP.
WHY
The problem is that when you try to set the parameters on the request to resume the download, the connection must not already be open.
The moment you invoke connection.getContentLength(), the connection is implicitly opened (effectively connection.connect() happens), so if you then try to set properties on the connection you will get the error mentioned.
FIX
What I did was close the HTTP connection right after invoking long totalFileSize = connection.getContentLength();
connection.disconnect(); // Disconnect from HTTP
After that you can set the parameters you want on the connection and invoke connection.connect() whenever needed.
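A rough sketch of that sequence (not the exact code from this answer); note that here a fresh connection object is opened for the ranged request, and existingBytes is a placeholder for however many bytes are already on disk:

// First connection: used only to learn the total size.
HttpURLConnection sizeConn = (HttpURLConnection) url.openConnection();
long totalFileSize = sizeConn.getContentLength(); // implicitly connects
sizeConn.disconnect();                            // disconnect from HTTP

// Second connection: all request properties are set before it connects.
HttpURLConnection dataConn = (HttpURLConnection) url.openConnection();
dataConn.setRequestProperty("Range", "bytes=" + existingBytes + "-"); // existingBytes is a placeholder
dataConn.connect();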
TIP
In my particular case I was trying to download a file and needed to support resumable downloads, so what I did was the following (a sketch follows this list):
Check if the file exists.
If the file exists, get its length:
long bytesDownloaded = file.length();
Use this length when setting the Range header on the connection, so the download resumes from exactly the byte where it was paused.
Write the bytes to the end of the file.
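Put together, those steps look roughly like this (a sketch, not the exact code from this answer; it reuses the question's downloadPath and filepath, and targetFile is a placeholder name):

File targetFile = new File(filepath);                  // partially downloaded local file
long bytesDownloaded = targetFile.exists() ? targetFile.length() : 0;

HttpURLConnection connection = (HttpURLConnection) new URL(downloadPath).openConnection();
if (bytesDownloaded > 0) {
    // Ask the server to start exactly where the local file ends.
    connection.setRequestProperty("Range", "bytes=" + bytesDownloaded + "-");
}
connection.connect();

// Ideally also check for HTTP 206 (Partial Content) before appending.
try (InputStream in = connection.getInputStream();
     FileOutputStream out = new FileOutputStream(targetFile, true)) { // true = append
    byte[] buffer = new byte[8192];
    int read;
    while ((read = in.read(buffer)) != -1) {
        out.write(buffer, 0, read);
    }
}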
You should set request properties before calling getContentLength().
If you set the Range header to the existing file's length, getContentLength() will return the number of remaining bytes, so a content length of 0 means the file has already been downloaded completely.
But if you want to build a download manager, @Jon Skeet's approach is reasonable.
Edit:
public abstract long getContentLength()
Added in API level 1.
Tells the length of the content, if known. Returns the number of bytes of the content, or a negative number if unknown. If the content length is known but exceeds Long.MAX_VALUE, a negative number is returned.
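A short sketch of that order of calls, reusing the question's variables; the zero check reflects the behaviour described in this answer:

HttpURLConnection connection = (HttpURLConnection) new URL(downloadPath).openConnection();
File file = new File(filepath);
// Set the Range header first, then ask for the content length.
connection.setRequestProperty("Range", "bytes=" + file.length() + "-");
int remaining = connection.getContentLength(); // bytes still to download
if (remaining == 0) {
    // Nothing left to fetch: the file is already complete.
}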
Related
I'm sending a zip file over an FTP connection, so to fetch the file size I have used:
URLConnection conn = imageURL.openConnection();
long l = conn.getContentLengthLong();
But it returns -1.
For files sent over an HTTP request, by contrast, I get the correct file size.
How can I get the correct file size over an FTP connection in this case?
for files sent over an HTTP request, I get the correct file size.
MAYBE. URLConnection.getContentLength[Long] returns specifically the content-length header. HTTP (and HTTPS) supports several different ways of delimiting bodies, and depending on the HTTP options and versions the server implements, it might use a content-length header or it might not.
Somewhat similarly, an FTP server may provide the size of a 'retrieved' file at the beginning of the operation, or it may not. But it never uses a content-length header to do so, so getContentLength[Long] doesn't get it. However, the implementation code does store it internally if the server provides it, and it can be extracted by the following quite ugly hack:
URL url = new URL("ftp://anonymous:dummy@192.168.56.2/pub/test");
URLConnection conn = url.openConnection();
try (InputStream is = conn.getInputStream()) {
    if (!conn.getClass().getName().equals("sun.net.www.protocol.ftp.FtpURLConnection"))
        throw new Exception("conn wrong");
    // Pull the internal FTP client out of the connection via reflection.
    Field fld1 = conn.getClass().getDeclaredField("ftp");
    fld1.setAccessible(true);
    Object ftp = fld1.get(conn);
    if (!ftp.getClass().getName().equals("sun.net.ftp.impl.FtpClient"))
        throw new Exception("ftp wrong");
    // The client records the size announced by the server for the last transfer.
    Field fld2 = ftp.getClass().getDeclaredField("lastTransSize");
    fld2.setAccessible(true);
    long size = fld2.getLong(ftp);
    System.out.println(size);
}
Hacking undocumented internals may fail at any time, and versions of Java above 8 progressively discourage it: 9 through 15 give a warning message about illegal access unless you use --add-opens to permit it and 16 makes it an error (i.e. fatal). Unless you really need the size before reading the data (which implicitly gives you the size) I would not recommend this.
Consider the following code.
try {
    httpURLConnection = (HttpURLConnection) new URL(strings[0]).openConnection();
    httpURLConnection.setConnectTimeout(Config.HTTP_CONNECTION_TIMEOUT);
    httpURLConnection.setReadTimeout(Config.HTTP_CONNECTION_TIMEOUT);
    httpURLConnection.connect();
    responseCode = httpURLConnection.getResponseCode();
    httpURLConnection.getHeaderFields();
} finally {
    httpURLConnection.disconnect();
}
The issue is that even when I don't use the InputStream to read the response, I can see the response body in my Internet/Wi-Fi connection logs. What I want is simply to check a field in the header, and based on that field I will decide whether to continue reading the InputStream.
My questions are these:
Is it correct behavior for the connection to automatically download all or part of the file even before a BufferedInputStream is created and read from?
If yes, is it possible to stop the file download until an InputStream is used to read the response?
If not, is there something I am doing wrong or missing?
The response includes both the headers and the body; the server does not wait for the client to acknowledge the headers before sending the body.
By the time the client is able to read the response code from the headers, part of the body has already been sent; how much depends on network latency, buffering, and so on.
The current implementation of HttpURLConnection.getResponseCode() even uses getInputStream() to ensure that the connection is in the correct state.
The client can choose to ignore the body, but that's usually not recommended, because it may prevent a persistent connection from being reused.
I am not sure about Android, but since Java 6 a background thread is automatically used to read the remaining data.
If If-Modified-Since is not an option, why not use a HEAD request?
The HTTP HEAD method requests the headers that are returned if the
specified resource would be requested with an HTTP GET method. Such a
request can be done before deciding to download a large resource to
save bandwidth, for example.
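A small sketch of that approach, built on the question's snippet; the header name and the shouldDownload(...) helper are placeholders, not real API:

HttpURLConnection headConn = (HttpURLConnection) new URL(strings[0]).openConnection();
headConn.setRequestMethod("HEAD");                       // no response body is transferred
headConn.setConnectTimeout(Config.HTTP_CONNECTION_TIMEOUT);
headConn.setReadTimeout(Config.HTTP_CONNECTION_TIMEOUT);
headConn.connect();
String headerValue = headConn.getHeaderField("X-My-Header"); // placeholder header name
headConn.disconnect();

if (shouldDownload(headerValue)) {                       // placeholder decision helper
    // Issue the real GET request here and read its InputStream.
}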
I'm currently developing an app, that should measure (fairly precisely) the size of webpages.
The thing I'm struggling with now is that I need to know the sizes of particular files that are on the website. I have an array of URLs and I try to fetch their headers to get Content-Length, however some files return -1 since they are chunked. If they return -1 I try to download them to get their size.
And here lies the problem - I found out that I always get uncompressed version of the file.
Example file -
http://www.google-analytics.com/analytics.js
When I open it in Chrome, the headers show that the response is served gzip-compressed.
However, when I download it using HttpURLConnection, it has a size of 25421 bytes, and when I check the Content-Encoding header, it's always null.
connection = (HttpURLConnection) (new URL(url)).openConnection();
connection.setRequestProperty("Accept-Encoding", "gzip");
connection.connect();
int contentLength = connection.getContentLength();
if (contentLength == -1 && connection != null) {
    InputStream input = connection.getInputStream();
    byte[] buffer = new byte[4096];
    int count = 0, len;
    while ((len = input.read(buffer)) > 0) {
        count += len;
    }
    contentLength = count;
}
So the problem is that my application downloads a webpage and reports it as (let's say) 400 kB, but when I download it using a tool like http://tools.pingdom.com/fpt/, the size is much smaller, around 100 kB, because most of the scripts are gzipped and therefore less data is actually transferred.
I know 300kB is not that much, but when you are using a mobile transfer, every kB counts, and I want my app to be precise.
Could you point out where I'm making a mistake, or how I could solve this?
Thank you
Your HttpURLConnection setup code looks correct to me. You could try setting the User-Agent to that of a standard browser; perhaps the server is trying to be more intelligent than it ought to be. Failing that, run your traffic through a debugging proxy like Fiddler or Burp to see what's going on at the network level.
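For example, a hedged sketch based on the question's snippet (the User-Agent string is just an example value):

connection = (HttpURLConnection) (new URL(url)).openConnection();
connection.setRequestProperty("Accept-Encoding", "gzip");
// Identify as a regular desktop browser in case the server varies its response by client.
connection.setRequestProperty("User-Agent", "Mozilla/5.0 (Windows NT 10.0; Win64; x64)");
connection.connect();
System.out.println("Content-Encoding: " + connection.getContentEncoding());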
If you are using iJetty, you have to enable gzip compression first
You have to enable the GzipFilter to make Jetty return compressed content. Have a look here on how to do that: http://blog.max.berger.name/2010/01/jetty-7-gzip-filter.html
You can also use the gzip init parameter to make Jetty search for pre-compressed content. That means that if the file file.txt is requested, Jetty will look for a file named file.txt.gz and return that instead.
When opening a URLConnection I use the following code in order to get the content length, however it returns -1.
URL url = new URL(sUrl[0]);
URLConnection connection = url.openConnection();
connection.connect();
int fileLength = connection.getContentLength();
I presumed then that the server was not setting a content-length header (and a dig in the connection object confirms the value is -1), and so set one myself using the following in PHP:
header('Content-Length: '.strlen($output));
When I print out the value of strlen($output) I get the correct value, but this header does not seem to make it to Java.
Any suggestions or further code required?
Thanks
If the content length header is indeed being sent back to you from the server you are connecting to, then the code you have will work. You can prove that by hitting a simple web service that does return Content-Length like in the following code:
URL url = new URL("http://freegeoip.net/json/199.201.1.200");
URLConnection connection = url.openConnection();
int fileLength = connection.getContentLength();
System.out.println(fileLength);
When you run this, you will see it print out a content length.
It turns out Content-Length should be ignored when Transfer-Encoding is set to chunked. It would appear that my web host takes this one step further and strips out the header completely, even if I set it manually in PHP. Confirmed with Chrome's Advanced REST client app.
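For anyone hitting the same thing, a quick sketch of how to tell which case you are in before trusting getContentLength():

URLConnection connection = url.openConnection();
connection.connect();

String transferEncoding = connection.getHeaderField("Transfer-Encoding");
int contentLength = connection.getContentLength();

if ("chunked".equalsIgnoreCase(transferEncoding) || contentLength == -1) {
    // No usable Content-Length: count the bytes while reading the stream instead.
} else {
    // contentLength can be trusted here.
}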
I use a URLConnection to download a stream from the Internet, but after I reset the modem I can't continue downloading the stream because I get an error: Connection reset. How can I solve this?
Here is my code:
URL url = new URL(_URL);
HttpURLConnection hUC = (HttpURLConnection) url.openConnection();
hUC.connect();
InputStream is = hUC.getInputStream(); // response stream for the download
while (true) {
    if ((_data.num = is.read(_data.b)) == -1) {
        break;
    }
    // write to file
    fos.write(_data.b, 0, _data.num);
}
You can't - at least, not how you may be expecting.
Instead, you need to handle your exception, and determine how much data you've already read. Once your Internet connection is re-established - assuming that the HTTP server you're downloading from supports requestable byte ranges - you can then set custom HTTP Headers on the request and re-download the remaining portions. (This will require a new HttpURLConnection.)
http://www.w3.org/Protocols/rfc2616/rfc2616-sec14.html#sec14.35 shows the related HTTP specifications involved to make this work.
This is a bit more complicated if you're looking for a "resume" type feature.
You would need to reissue the request once the internet comes back after a disconnect, and add a header to the request in order to resume the download at the byte number where you left off.
You need to set the Range property in the request header in order to specify how far in you're resuming. Then you would just continue to write to the "fos" object from there.
Check out this url: Java: resume Download in URLConnection
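A rough sketch of that resume step (downloadedFile and bytesSoFar are placeholders; _URL and fos are the question's own variables):

// Reissue the request once the connection is back, asking only for the missing bytes.
long bytesSoFar = downloadedFile.length();               // placeholder for the partial local file

HttpURLConnection resumeConn = (HttpURLConnection) new URL(_URL).openConnection();
resumeConn.setRequestProperty("Range", "bytes=" + bytesSoFar + "-");
resumeConn.connect();

// HTTP 206 (Partial Content) means the server honoured the Range header.
if (resumeConn.getResponseCode() == HttpURLConnection.HTTP_PARTIAL) {
    InputStream is = resumeConn.getInputStream();
    // Keep writing to fos (opened in append mode) exactly as before.
}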