HttpUrlConnection gets response body on connect() - java

Consider the following code.
try {
    httpURLConnection = (HttpURLConnection) new URL(strings[0]).openConnection();
    httpURLConnection.setConnectTimeout(Config.HTTP_CONNECTION_TIMEOUT);
    httpURLConnection.setReadTimeout(Config.HTTP_CONNECTION_TIMEOUT);
    httpURLConnection.connect();
    responseCode = httpURLConnection.getResponseCode();
    httpURLConnection.getHeaderFields();
}
finally {
    httpURLConnection.disconnect();
}
The issue is that even when I don't use the InputStream to read the response, I can see the response body in my Internet/Wi-Fi connection logs. What I want is simply to check a field in the header, and based on that field I will decide whether to continue reading the InputStream.
My questions are these:
Is it correct behavior for the connection to automatically download all or part of the file even before a BufferedInputStream is created and read from?
If yes, then is it possible to stop the file download until an InputStream is used to read the response?
If not then is there something I am doing wrong or missing?

The response includes both the headers and the body; the server does not wait for the client to acknowledge the headers before sending the body.
By the time the client is able to read the response code from the headers, part of the body has already been sent; how much depends on network latency, buffering, and so on.
The current implementation of HttpURLConnection.getResponseCode() even uses getInputStream() to ensure that the connection is in the correct state.
The client can choose to ignore the body, but that is usually not recommended, because it may prevent a persistent connection from being reused.
I am not sure about Android, but since Java 6, a background thread is automatically used to read the remaining data.
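To illustrate the pattern the question is after, here is a minimal sketch that checks a header before committing to the body, assuming a placeholder URL and that Content-Length is the header of interest (getContentLengthLong() needs Java 7+ / Android API 24+); keep in mind that some body bytes may already be buffered by the time the check runs:

import java.io.BufferedInputStream;
import java.io.InputStream;
import java.net.HttpURLConnection;
import java.net.URL;

public class ConditionalDownload {
    public static void main(String[] args) throws Exception {
        HttpURLConnection conn = (HttpURLConnection) new URL("https://example.com/file").openConnection();
        conn.setConnectTimeout(5000);
        conn.setReadTimeout(5000);

        int responseCode = conn.getResponseCode();      // headers are parsed at this point
        long length = conn.getContentLengthLong();      // the header field we decide on

        if (responseCode == HttpURLConnection.HTTP_OK && length < 10_000_000L) {
            // Only now read the body.
            try (InputStream in = new BufferedInputStream(conn.getInputStream())) {
                byte[] buffer = new byte[8192];
                while (in.read(buffer) != -1) {
                    // process the data
                }
            }
        } else {
            // Skip the body; disconnect() tells the connection the rest is not wanted.
            conn.disconnect();
        }
    }
}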
If If-Modified-Since is not an option, why not use a HEAD request?
The HTTP HEAD method requests the headers that are returned if the specified resource would be requested with an HTTP GET method. Such a request can be done before deciding to download a large resource to save bandwidth, for example.
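A rough sketch of such a HEAD request with HttpURLConnection, assuming a placeholder URL; only headers are exchanged, so there is no body to discard:

import java.net.HttpURLConnection;
import java.net.URL;

public class HeadRequest {
    public static void main(String[] args) throws Exception {
        HttpURLConnection conn = (HttpURLConnection) new URL("https://example.com/large-file").openConnection();
        conn.setRequestMethod("HEAD");   // ask for the headers only
        conn.setConnectTimeout(5000);
        conn.setReadTimeout(5000);

        int code = conn.getResponseCode();
        String lastModified = conn.getHeaderField("Last-Modified");
        long length = conn.getContentLengthLong();
        System.out.println(code + " " + lastModified + " " + length);

        conn.disconnect();
        // If the headers look right, issue a normal GET on a new connection.
    }
}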

Related

can I send same HTTP request with same HttpURLConnection multiple times?

In order to send string data to the server once, I do the following:
Make an HttpURLConnection to my URL address and open it
Set the required headers
Set setDoOutput to true on the connection
Create a new DataOutputStream from the connection and finally write my string data to it.
HttpURLConnection myConn = (HttpURLConnection) myUrl.openConnection();
myConn.setRequestProperty("Accept", "application/json, text/plain, */*");
myConn.setDoOutput(true);
DataOutputStream my_output = new DataOutputStream(myConn.getOutputStream());
my_output.write(myData.getBytes("UTF-8"));
But what should I do if I want to send exactly the same data with the same URL and headers multiple times?
Can I write to it multiple times? (I mean, is it possible to use the last line of code multiple times?) Or should I repeat the above steps with a new connection each time?
And if so, should I wait for some seconds or milliseconds before sending the next one?
I also looked at some alternatives such as the HttpClient API and making synchronous HTTP requests, which as far as I can tell would only help me set the headers once.
In the end, I appreciate your help and support, and any other alternatives would be welcome.
Thanks a million.
I understand that the question has been answered in the comments, but I am leaving this here so that future viewers can see it.
An HTTP request contains 3 main parts:
Request Line: Method, Path, Protocol
Headers: Key-value pairs
Body: Data
Running my_output.write() just adds bytes to the body until my_output.flush() has been executed. Flushing the stream writes the data to the server.
Because an HTTP connection is usually closed by the server once all data has been sent and received, whether you create a new connection or just append to the body depends on your intentions. Typically, clients create a new connection for each request, because each response should be handled individually, rather than sending one body with the data repeated. This can vary, though, because some servers choose to hold a connection open (such as WebSockets).
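To make that concrete, here is a minimal sketch of the new-connection-per-request approach, assuming a hypothetical endpoint and payload:

import java.io.DataOutputStream;
import java.io.InputStream;
import java.net.HttpURLConnection;
import java.net.URL;

public class RepeatedPost {
    public static void main(String[] args) throws Exception {
        URL myUrl = new URL("https://example.com/api"); // placeholder endpoint
        String myData = "{\"hello\":\"world\"}";        // placeholder payload

        for (int i = 0; i < 3; i++) {
            // A fresh HttpURLConnection per request; the instances are not reusable.
            HttpURLConnection conn = (HttpURLConnection) myUrl.openConnection();
            conn.setRequestMethod("POST");
            conn.setRequestProperty("Accept", "application/json, text/plain, */*");
            conn.setDoOutput(true);

            try (DataOutputStream out = new DataOutputStream(conn.getOutputStream())) {
                out.write(myData.getBytes("UTF-8"));
            }

            int status = conn.getResponseCode();
            System.out.println("Attempt " + i + ": " + status);

            // Consume and close the response so the underlying socket can be reused.
            try (InputStream in = status >= 400 ? conn.getErrorStream() : conn.getInputStream()) {
                if (in != null) {
                    while (in.read() != -1) { /* discard */ }
                }
            }
        }
    }
}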
If you are open to external libraries, you may find this chart insightful:
AsyncHttpClient would be a good fit for your intentions.
Alternatively, you can use cURL by running a terminal command with Runtime.getRuntime().exec(). More information about using cURL with POST Requests can be found here. While cURL is efficient, you have to depend on the fact that your OS supports the command (though usually all devices that can run Java have this command).
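For completeness, a rough sketch of shelling out to curl, here via ProcessBuilder rather than Runtime.getRuntime().exec(), assuming the curl binary is on the PATH and using a placeholder URL and payload:

import java.io.BufferedReader;
import java.io.InputStreamReader;

public class CurlPost {
    public static void main(String[] args) throws Exception {
        // Each invocation is an independent request, so repeating it is trivial.
        Process p = new ProcessBuilder(
                "curl", "-s", "-X", "POST",
                "-H", "Content-Type: application/json",
                "-d", "{\"hello\":\"world\"}",
                "https://example.com/api")
                .redirectErrorStream(true)
                .start();

        try (BufferedReader reader = new BufferedReader(new InputStreamReader(p.getInputStream(), "UTF-8"))) {
            String line;
            while ((line = reader.readLine()) != null) {
                System.out.println(line);
            }
        }
        p.waitFor();
    }
}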

HttpUrlConnection reading chunked response

I'm working on a project where I have to use HttpUrlConnection (Android) to read the input stream.
It turns out that when I'm reading the input stream the data is malformed and larger than the original content sent by the server. The server response headers contain both "Content-Length" and "Transfer-Encoding: chunked", which from what I know is a problem, as the two shouldn't coexist.
Aside from that, the input stream received from HttpUrlConnection contains the entire body content (including the chunk offset information).
I have two questions:
Shouldn't the HttpUrlConnection handle chunked data?
How do I get the data from the input stream without the chunk information?
You're correct that HttpUrlConnection should be handling chunked data. The fact that you're seeing both of those headers at all means the response is probably being malformed somewhere: something has already sent either a \n\n or \r\n\r\n, so HttpUrlConnection views what follows as part of the actual body.
If you WANT to get at the raw data, use a socket and connect to the URL's host on the correct port (probably 80, or 443 for SSL).
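A very rough sketch of that raw-socket approach over plain HTTP on port 80 (hypothetical host and path); for SSL you would obtain the socket from SSLSocketFactory instead. The chunk-size lines appear exactly as the server sends them:

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.io.OutputStreamWriter;
import java.io.Writer;
import java.net.Socket;

public class RawHttpPeek {
    public static void main(String[] args) throws Exception {
        try (Socket socket = new Socket("example.com", 80);
             Writer out = new OutputStreamWriter(socket.getOutputStream(), "US-ASCII");
             BufferedReader in = new BufferedReader(new InputStreamReader(socket.getInputStream(), "ISO-8859-1"))) {

            out.write("GET /some/path HTTP/1.1\r\n");
            out.write("Host: example.com\r\n");
            out.write("Connection: close\r\n");
            out.write("\r\n");
            out.flush();

            // Prints the status line, every header, and (for chunked responses)
            // the hexadecimal chunk-size lines, with nothing decoded for you.
            String line;
            while ((line = in.readLine()) != null) {
                System.out.println(line);
            }
        }
    }
}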
EDIT: java.net.URLConnection states under the connect() method
Interact with the resource; query header fields and contents.
This shows that a URLConnection, prior to reading anything in from any sort of provided reader, queries the header information. Pardon me for not including this the first time.

Multiple HttpURLConnection calls for get throwing Premature end of file exception with InputStream

I'm trying to make multiple calls to a REST API using HttpURLConnection with the GET method, retrieving the response through an InputStream.
It worked fine for three months, but now it's throwing the exception below:
SAXException Occurred during getArtifactsUrl method:: org.xml.sax.SAXParseException; Premature end of file.
at org.apache.xerces.parsers.DOMParser.parse(Unknown Source) [xercesImpl.jar:6.1.0.Final]
at org.apache.xerces.jaxp.DocumentBuilderImpl.parse(Unknown Source) [xercesImpl.jar:6.1.0.Final]
at javax.xml.parsers.DocumentBuilder.parse(DocumentBuilder.java:121) [:1.7.0_03]
Below is the line of code where I'm making the second call to parse the response:
request = (HttpURLConnection) endpointUrl.openConnection();
inputstream = request.getInputStream();
doc = dBuilder.parse(inputstream);
The first call works fine using the request and inputstream objects, but the second call fails. I tried all the possible answers I found on Google, but no luck:
after every call:
inputstream.close();
request.disconnect();
Remember that request is an HttpURLConnection object.
I would greatly appreciate it if you could help solve this, as it is now a high-priority production issue!
First you should check for error cases and not assume it's always working.
Try this:
request = (HttpURLConnection) endpointUrl.openConnection();
request.connect(); // not really necessary (done automatically)
int statusCode = request.getResponseCode();
if (statusCode == 200) { // or maybe other 2xx codes?
    // Success - should work if server gives good response
    inputstream = request.getInputStream();
    // if you get status code 200 and still have the same error, you should
    // consider logging the stream to see what document you get from server.
    // (see below *)
    doc = dBuilder.parse(inputstream);
} else {
    // Something happened
    // handle error, try again if it makes sense, ...
    if (statusCode == 404) ... // resource not found
    if (statusCode == 500) ... // internal server error
    // maybe there is something interesting in the body
    inputstream = request.getErrorStream();
    // read and parse errorStream (but probably this is not the body
    // you expected)
}
Have a look at the List of HTTP status codes.
And in some nasty cases, there are other problems which are not easy to detect if you just sit behind HttpURLConnection. Then you could enable logging or snoop the TCP/IP traffic with an appropriate tool (depending on your infrastructure, rights, OS, ...). This SO post might help you.
*) In your case I suppose that you're getting a non-error status code from the server but unparseable XML. If logging the traffic is not your thing, you could read the InputStream, write it to a file, and then parse the same data as before. For example, copy the stream into a ByteArrayOutputStream, write the resulting byte[] to a file, and then wrap the bytes in a ByteArrayInputStream for your XML parser. Or you could use Commons IO's TeeInputStream to handle that for you.
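A small sketch of that buffering idea using only the JDK (no Commons IO), assuming request and dBuilder are the objects from the snippet above:

import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.InputStream;
import java.net.HttpURLConnection;
import java.nio.file.Files;
import java.nio.file.Paths;
import javax.xml.parsers.DocumentBuilder;
import org.w3c.dom.Document;

public class DebugParse {
    // Reads the whole body into memory, saves a copy to a file for inspection,
    // and parses exactly the same bytes.
    static Document parseAndKeepCopy(HttpURLConnection request, DocumentBuilder dBuilder) throws Exception {
        ByteArrayOutputStream copy = new ByteArrayOutputStream();
        try (InputStream in = request.getInputStream()) {
            byte[] buffer = new byte[8192];
            int n;
            while ((n = in.read(buffer)) != -1) {
                copy.write(buffer, 0, n);
            }
        }
        byte[] bytes = copy.toByteArray();
        Files.write(Paths.get("last-response.xml"), bytes); // inspect this file when parsing fails
        return dBuilder.parse(new ByteArrayInputStream(bytes));
    }
}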
There are cases where connection.getResponseCode() throws an exception. Then it was not possible to parse the HTTP header. This should only happen if there are strange errors in server software, hardware or perhaps a firewall, proxy or load balancer not behaving well.
One more thing: You might consider choosing an HTTP Client library and not directly use HttpURLConnection.

How do I reset a URL connection

I use a URL connection to download a stream from the Internet. But after I reset the modem, I can't continue downloading the stream because it fails with the error: Connection reset. How do I solve this?
Here is my code:
URL url = new URL(_URL);
HttpURLConnection hUC = (HttpURLConnection) url.openConnection();
hUC.connect();
InputStream is = hUC.getInputStream();
while (true) {
    if ((_data.num = is.read(_data.b)) == -1) {
        break;
    }
    // write to file
    fos.write(_data.b, 0, _data.num);
}
You can't - at least, not how you may be expecting.
Instead, you need to handle your exception, and determine how much data you've already read. Once your Internet connection is re-established - assuming that the HTTP server you're downloading from supports requestable byte ranges - you can then set custom HTTP Headers on the request and re-download the remaining portions. (This will require a new HttpURLConnection.)
http://www.w3.org/Protocols/rfc2616/rfc2616-sec14.html#sec14.35 shows the related HTTP specifications involved to make this work.
This is a bit more complicated if you're looking for a "resume" type feature.
You would need to reissue the request once the internet comes back after a disconnect, and add a header to the request in order to resume the download at the byte number where you left off.
You need to set the Range property in the request header in order to specify how far in you're resuming. Then you would just continue to write to the "fos" object from there.
Check out this url: Java: resume Download in URLConnection
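A rough sketch of that resume logic, assuming a placeholder URL, a server that honors byte ranges, and that downloaded holds the number of bytes already written to the file:

import java.io.FileOutputStream;
import java.io.InputStream;
import java.net.HttpURLConnection;
import java.net.URL;

public class ResumeDownload {
    public static void main(String[] args) throws Exception {
        long downloaded = 123456L; // bytes already written to the partial file (placeholder)

        HttpURLConnection conn = (HttpURLConnection) new URL("https://example.com/big.bin").openConnection();
        // Ask the server to start where the previous attempt left off.
        conn.setRequestProperty("Range", "bytes=" + downloaded + "-");

        int code = conn.getResponseCode();
        if (code == HttpURLConnection.HTTP_PARTIAL) { // 206: the server honored the Range header
            try (InputStream is = conn.getInputStream();
                 FileOutputStream fos = new FileOutputStream("big.bin", true)) { // append mode
                byte[] buffer = new byte[8192];
                int n;
                while ((n = is.read(buffer)) != -1) {
                    fos.write(buffer, 0, n);
                }
            }
        } else {
            // 200 means the server ignored the range; restart the download from scratch.
        }
    }
}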

Sending an error response with com.sun.net.httpserver.HttpServer

I'm an experienced Java programmer but a newbie web developer. I'm trying to put together a simple web service using the HttpServer class that ships with JDK 1.6. From the examples I've viewed, some typical code from an HttpHandler's handle method would look something like this:
Headers responseHeaders = exchange.getResponseHeaders();
responseHeaders.set("Content-Type", "text/plain");
exchange.sendResponseHeaders(200, 0);
OutputStream responseBody = exchange.getResponseBody();
responseBody.write(createMyResponseAsBytes());
responseBody.close();
My question: What happens if I send a response header to indicate success (i.e. response code 200) and perhaps begin to stream back data and then encounter an exception, which would necessitate sending an "internal server error" response code along with some error content? In other words, what action should I take given that I've already sent a partial "success" response back to the client at the point where I encounter the exception?
The 200 status is not actually sent until you either flush the stream or close it. But once it has been sent, there is nothing you can do about it.
In practice this only becomes a problem when you have a really large amount of data and use chunked transfer, because then the headers go out before the body is complete.
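One way to stay on the safe side, sketched below with a hypothetical handler: do all the failure-prone work and build the complete body before calling sendResponseHeaders, so that an exception can still be turned into a 500:

import java.io.IOException;
import java.io.OutputStream;
import java.nio.charset.StandardCharsets;
import com.sun.net.httpserver.HttpExchange;
import com.sun.net.httpserver.HttpHandler;

public class SafeHandler implements HttpHandler {
    @Override
    public void handle(HttpExchange exchange) throws IOException {
        byte[] body;
        int status;
        try {
            body = createMyResponseAsBytes();   // all failure-prone work happens here
            status = 200;
        } catch (Exception e) {
            body = ("internal error: " + e.getMessage()).getBytes(StandardCharsets.UTF_8);
            status = 500;
        }

        exchange.getResponseHeaders().set("Content-Type", "text/plain");
        // Headers (and the status code) only go out now, when the outcome is known.
        exchange.sendResponseHeaders(status, body.length);
        try (OutputStream out = exchange.getResponseBody()) {
            out.write(body);
        }
    }

    private byte[] createMyResponseAsBytes() {
        // Placeholder for the real response generation from the question.
        return "hello".getBytes(StandardCharsets.UTF_8);
    }
}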
