Java HTTP request poor performance

This is the code I'm using to get the response text.
private static String request(String urlstr) {
    // create connection
    try {
        URL url = new URL(urlstr);
        URLConnection conn = url.openConnection();
        StringBuilder response = new StringBuilder();
        conn.setUseCaches(false);
        conn.setRequestProperty("User-Agent", USER_AGENT);
        // read response
        BufferedReader in = new BufferedReader(new InputStreamReader(conn.getInputStream()));
        String line = null;
        while ((line = in.readLine()) != null) {
            response.append(line);
        }
        in.close();
        return response.toString();
    } catch (Exception e) {
        return null;
    }
}
The problem is that when I issue the very same request (a simple GET request; the response is JSON) from my Chrome browser, I get the response almost a second faster than with this code in my application.
I wonder if there's anything I'm doing wrong in my code? Or is it that Chrome somehow handles the request faster?
Maybe there are some techniques to make this process faster?
Thanks

You read the response line by line, but in the end you append every line to a single response, so reading line by line is not really required. You can instead do
char[] cbuf = new char[1024];
int len;
while ((len = in.read(cbuf)) != -1) {
    response.append(cbuf, 0, len);
}
This way the response is read in much larger chunks, and you avoid the overhead of readLine(), which has to scan the input for newline characters and split the content into lines.
You could also do a
new StringBuilder(conn.getContentLength());
so the StringBuilder does not have to grow its capacity every time new content is appended. StringBuilder uses a char[] internally, and whenever that array is too small for the new content, everything has to be copied into a new, larger array.
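Putting both suggestions together, the read path might look something like this (a sketch; note that getContentLength() returns a byte count, which only approximates the character count, but it still works as a capacity hint):
int contentLength = conn.getContentLength();
// fall back to a default capacity when the server sends no Content-Length (-1)
StringBuilder response = new StringBuilder(contentLength > 0 ? contentLength : 8192);
BufferedReader in = new BufferedReader(new InputStreamReader(conn.getInputStream()));
char[] cbuf = new char[1024];
int len;
while ((len = in.read(cbuf)) != -1) {
    response.append(cbuf, 0, len);
}
in.close();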

Related

Android HttpUrlConnection result string truncated

I am currently developing an Android application and have encountered the following problem.
I am making an HTTP request to a server that is supposed to send back XML content that I then parse. I noticed recurring errors while parsing long XML strings, so I decided to display the raw result of my requests and discovered that the string (or the stream?) I receive is randomly truncated. Sometimes I get the whole string, sometimes half, sometimes a third, and the truncation seems to follow a pattern in the number of characters: I sometimes get 320 characters after a request, then 156 after the next, then 320 twice, then 156 again (these aren't the actual numbers, but they follow a pattern).
Here is my code for the request and conversion of the InputStream into a string:
private String downloadUrlGet(String myurl) throws IOException {
    InputStream is = null;
    // Only display the first 20000 characters of the retrieved
    // web page content.
    int len = 20000;
    try {
        URL url = new URL(myurl);
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setReadTimeout(10000 /* milliseconds */);
        conn.setConnectTimeout(15000 /* milliseconds */);
        conn.setRequestMethod("GET");
        conn.setDoInput(true);
        conn.setRequestProperty("Content-Type", "application/xml");
        // Starts the query
        conn.connect();
        int response = conn.getResponseCode();
        Log.d(DEBUG_TAG, "The response is: " + response);
        is = conn.getInputStream();
        // Convert the InputStream into a string
        String contentAsString = readIt(is, len);
        return contentAsString;
        // Makes sure that the InputStream is closed after the app is
        // finished using it.
    } finally {
        if (is != null) {
            is.close();
        }
    }
}

// Reads an InputStream and converts it to a String.
private String readIt(InputStream stream, int len) throws IOException, UnsupportedEncodingException {
    Reader reader = null;
    reader = new InputStreamReader(stream, "UTF-8");
    char[] buffer = new char[len];
    reader.read(buffer);
    return new String(buffer);
}
The length of the XML that I try to retrieve is much less than 20000.
I tried to use HttpURLConnection.setChunkedStreamingMode() with 0 and various other numbers as the parameter, but it didn't change anything.
Thanks in advance for any suggestions.
You are making the usual mistake of assuming that read() fills the buffer. See the Javadoc: it isn't obliged to do that. In fact, it isn't obliged to transfer more than one character. You need to read in a loop until you encounter end of stream (read() returns -1).
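Applied to the readIt() method from the question, a loop-based version might look like this (a sketch that keeps the original fixed buffer size):
private String readIt(InputStream stream, int len) throws IOException {
    Reader reader = new InputStreamReader(stream, "UTF-8");
    char[] buffer = new char[len];
    int offset = 0;
    // keep reading until the buffer is full or the stream ends
    while (offset < len) {
        int read = reader.read(buffer, offset, len - offset);
        if (read == -1) {
            break;
        }
        offset += read;
    }
    // convert only the characters actually read
    return new String(buffer, 0, offset);
}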

getContentLength returning 1225, real length 3365

I am currently working with Android, and I am using an HTTP connection with some headers (I haven't included them or the real URL for security purposes) to get a JSON response from an API and feed that response back into the application. The problem is that the getContentLength() method of the HTTP request returns the wrong length (1225, while the correct length of the JSON array is 3365 characters).
I have a feeling that the JSON is not fully loaded when my reader starts to read it, and as such it is only reading the JSON loaded at that point. Is there any way around this, possibly using a delay on the HTTP connection or waiting until the data is fully loaded before reading it?
URL url = new URL("https://www.exampleofurl.com");
HttpURLConnection request = (HttpURLConnection) url.openConnection();
request.connect();
int responseCode = request.getResponseCode();
if (responseCode == HttpURLConnection.HTTP_OK) {
    InputStream inputStream = request.getInputStream();
    InputStreamReader reader = new InputStreamReader(inputStream);
    long contentLength2 = Long.parseLong(request.getHeaderField("Content-Length"));
    Log.i("contentLength: ", "Content: " + contentLength2);
    // ...
}
I generally don't recommend relying on Content-Length: it may not be available (you get -1), or it may be affected by an intermediate proxy.
Why not just read your stream until it is exhausted into a memory buffer (say, a StringBuilder) and then take the actual size? For example:
BufferedReader br = new BufferedReader(new InputStreamReader(inputStream)); // inputStream in your code
String line;
StringBuilder sb = new StringBuilder();
while ((line = br.readLine()) != null) {
    sb.append(line);
}
// finished reading
System.out.println("data size = " + sb.length());
JSONObject data = new JSONObject(sb.toString());
// and don't forget finally clauses with closing streams/connections
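For completeness, that last comment might translate into something like this (a sketch using the variable names from the question; request is the HttpURLConnection):
BufferedReader br = null;
try {
    br = new BufferedReader(new InputStreamReader(inputStream));
    String line;
    StringBuilder sb = new StringBuilder();
    while ((line = br.readLine()) != null) {
        sb.append(line);
    }
    System.out.println("data size = " + sb.length());
} finally {
    if (br != null) {
        br.close(); // also closes the underlying InputStream
    }
    request.disconnect();
}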

Result coming back from the server is cut off and throws a "not valid JSON" exception

I am calling a WCF web service from Android. Everything appears to work fine until I call a method that returns quite a bit of data; the result appears to be cut off, and parsing it throws a "not a valid JSON object" error. I checked the result, and it returned about 9089 characters. BufferedReader and InputStreamReader both returned the same count with the same result. I tried calling a different method that returns about 2000 records, and it worked without problems.
Here is the sample code where I am reading the result:
StringEntity entity = new StringEntity(jsonObject.toString());
httpPost.setEntity(entity);
HttpResponse httpResponse = httpClient.execute(httpPost);
HttpEntity responseEntity = httpResponse.getEntity();
char[] buffer = new char[(int) responseEntity.getContentLength()];
InputStream stream1 = responseEntity.getContent();
InputStreamReader reader = new InputStreamReader(stream1);
reader.read(buffer);
stream1.close();
String sInvokeReturnValue = new String(buffer);
Any kind of help would be greatly appreciated.
Everything appears to work fine until I call a method that returns quite a bit of data; the result appears to be cut off, and parsing it throws a "not a valid JSON object" error.
I suspect that your read is hitting a buffer limit and returning fewer bytes than you expect. You should put your read in a loop to make sure you get all of the input.
What I would recommend is to use a StringWriter to read in your data. For example:
// not much point in allocating a huge buffer here
char[] buffer = new char[4096];
InputStreamReader reader = new InputStreamReader(responseEntity.getContent());
StringWriter writer = new StringWriter();
try {
    while (true) {
        int numRead = reader.read(buffer);
        if (numRead < 0) {
            break;
        }
        writer.write(buffer, 0, numRead);
    }
} finally {
    // closing the reader also closes the underlying content stream
    reader.close();
}
String sInvokeReturnValue = writer.toString();

How to read a textual HTTP response into a String exactly as-is?

The following code uses BufferedReader to read from an HTTP response stream:
final StringBuilder responseBuilder = new StringBuilder();
String line = bufferedReader.readLine();
while (line != null) {
    responseBuilder.append(line);
    responseBuilder.append('\n');
    line = bufferedReader.readLine();
}
response = responseBuilder.toString();
But appending '\n' to each line seems a bit flawed. I want to return the HTTP response exactly as-is, so what if it doesn't have a newline after the last line? One would get added anyway by the code above. Is there a better way?
I want to return the HTTP response exactly as-is
Don't use readLine() then - it's as simple as that. I'd suggest using a StringWriter instead:
StringWriter writer = new StringWriter();
char[] buffer = new char[8192];
int charsRead;
while ((charsRead = bufferedReader.read(buffer)) > 0) {
    writer.write(buffer, 0, charsRead);
}
response = writer.toString();
Note that even this won't work if you get the encoding wrong. To preserve the exact HTTP response, you'd need to read (and write) it as a binary stream.
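A minimal sketch of that binary approach, assuming a connection object whose stream is already open, could look like this:
InputStream in = connection.getInputStream();
ByteArrayOutputStream out = new ByteArrayOutputStream();
byte[] buffer = new byte[8192];
int bytesRead;
while ((bytesRead = in.read(buffer)) != -1) {
    out.write(buffer, 0, bytesRead);
}
byte[] raw = out.toByteArray();
// decode only once you know the charset, e.g. from the Content-Type header
String response = new String(raw, "UTF-8");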

How to get the data from the internet more efficiently?

In my app, I make a request to a public URL, open the source code of the web page, and extract the information I want from it. The entire process works, but it takes a long time to load the information. Is there a more efficient way to do this?
public class GetMethodEx {
    public String getInternetData(String currentUrl) throws Exception {
        BufferedReader in = null;
        String data = null;
        try {
            HttpClient client = new DefaultHttpClient();
            URI website = new URI(currentUrl);
            HttpGet request = new HttpGet();
            request.setURI(website);
            HttpResponse response = client.execute(request);
            in = new BufferedReader(new InputStreamReader(response.getEntity().getContent()));
            StringBuffer sb = new StringBuffer("");
            String l = "";
            String nl = System.getProperty("line.separator");
            while ((l = in.readLine()) != null) {
                sb.append(l + nl);
            }
            in.close();
            data = sb.toString();
            return data;
        } finally {
            if (in != null) {
                try {
                    in.close();
                    return data;
                } catch (Exception e) {
                    e.printStackTrace();
                }
            }
        }
    }
}
Using a StringBuffer is really not efficient for downloading large texts such as an HTML file. Since you are reading line by line, Java has to allocate memory for each line just to copy it into the StringBuffer, which leads to intense GC work. A StringBuffer also has a fixed internal capacity, so your program may reach a point where that capacity is exceeded, which forces the StringBuffer to resize by copying everything in the buffer into a new, larger one.
You should instead try to get the size of the HTML document you requested and read everything into a char array at once. That may not work, since HTTP allows data to be transferred in chunks of variable size; here is an idea of what you can do in that case:
String html = "";
CharBuffer buff = CharBuffer.allocate(16384);
int read = in.read(buff);
while (read > -1) {
    while (read > -1 && buff.remaining() > 0) {
        read = in.read(buff);
    }
    // convert only the characters actually read, not the whole backing array
    html += new String(buff.array(), 0, buff.position());
    buff.clear();
}
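For the case where the server does send a usable Content-Length, a sketch of the fixed-size approach might look like this (assuming an HttpURLConnection-style connection object rather than the HttpClient from the question; note the header gives a byte count, so the raw stream is read before decoding, and the UTF-8 charset here is an assumption):
int length = connection.getContentLength();
if (length > 0) {
    byte[] data = new byte[length];
    int offset = 0;
    InputStream raw = connection.getInputStream();
    // read() is not obliged to fill the array in one call, so loop
    while (offset < length) {
        int count = raw.read(data, offset, length - offset);
        if (count == -1) {
            break; // stream ended early
        }
        offset += count;
    }
    String html = new String(data, 0, offset, "UTF-8");
}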
