Downloading files in Java and common errors

I wrote a simple downloader as a Java applet. During testing I discovered that my way of downloading files is not even half as robust as, e.g., Firefox's way of doing it.
My code:
InputStream is = null;
FileOutputStream os = null;
try {
    os = new FileOutputStream(...);
    URL u = new URL(...);
    URLConnection uc = u.openConnection();
    is = uc.getInputStream();
    // copy the stream to the file in 1 KB chunks
    final byte[] buf = new byte[1024];
    int count;
    while ((count = is.read(buf)) != -1) {
        os.write(buf, 0, count);
    }
} finally {
    if (is != null) is.close();
    if (os != null) os.close();
}
Sometimes my applet works fine, and sometimes unexpected things happen: from time to time, in the middle of a download, the applet throws an IOException or simply loses the connection for a while, with no way to return to the current download and finish it.
I know that a really advanced solution is too complicated for a single inexperienced Java programmer, but maybe you know some techniques to minimise the risk of these problems occurring.

So you want to resume your download.
If you get an IOException while reading from the URL, there was a problem with the connection. This happens. Now you must note how much you have already downloaded, then open a new connection that starts from there.
To do this, use setRequestProperty() on the second connection and send the right header field for "I want only the range of the resource starting at ...". See section 14.35.2, Range Retrieval Requests, in the HTTP 1.1 specification. You should check the header fields on the response to see whether you really got back a range, though.
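As a rough illustration, here is a minimal sketch of such a resumed download over HTTP. It assumes the server supports byte ranges; partFile and url are placeholder names, not from the original code:

long alreadyDownloaded = partFile.length(); // bytes saved before the connection dropped
HttpURLConnection conn = (HttpURLConnection) url.openConnection();
conn.setRequestProperty("Range", "bytes=" + alreadyDownloaded + "-");
if (conn.getResponseCode() == HttpURLConnection.HTTP_PARTIAL) {
    // 206 Partial Content: the server honoured the range, so append to the file
    InputStream in = conn.getInputStream();
    OutputStream out = new FileOutputStream(partFile, true); // true = append
    byte[] buf = new byte[8192];
    int n;
    while ((n = in.read(buf)) != -1) {
        out.write(buf, 0, n);
    }
    in.close();
    out.close();
} else {
    // 200 OK: the server ignored the range; restart the download from scratch
}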

Related

Url Encoding Issue - Special Characters cause webserver crash

I have been searching for information about this, but since I'm new to web development the answers I find only confuse me further.
Basically, I have a webserver embedded in a Java modem (which uses a 1.3 IDE) that handles requests. These requests were processed fine as long as I kept them simple.
http://87.103.87.59/teste.html?a=10&b=10
This request is processed normally.
However, with the real request, my webserver crashes:
http://5.43.52.4/api.html?ATCOMMAND=AT%5EMTXTUNNEL=SMS,0035111111111,string sending test
The problem comes down to two things: the "%" character and the spaces in "string sending test".
To make everything clear, the handlers I'm using are these:
public InputStream is = null;
private OutputStream os = null;
private byte buffer[] = new byte[1024]; // the original "new byte[]" does not compile; a size must be given
String streamAux = "";
is = socketX.openInputStream();
os = socketX.openOutputStream();
if ((is.available() > 0) || (blockX == true))
{
    // Read data sent from the remote client
    int numDadosLidos = is.read(buffer);
    for (int i = 0; i < numDadosLidos; i++)
        streamAux = streamAux + (char) buffer[i]; // where the url will be stored
}
Basically I will need those parameters so I can use them to operate my Java device, so I think I'll need to do some sort of decoding, but there's a lot of information that I can't make sense of, and my 1.3 IDE is kind of keeping me stuck.
I apologize in advance for any newbie behaviour.
Hope you can lend me a hand,
Thanks
For those who are interested: I basically worked around the issue by forcing the message to be sent with the '-' character instead of spaces. It doesn't solve the underlying problem; it simply sidesteps the question in a "not ideal" way.
Still totally interested if someone figures this one out.
Thanks.
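For anyone in a similar situation: on a constrained Java 1.3 / J2ME-style device where java.net.URLDecoder may not be available, a hand-rolled percent-decoder is small enough to write yourself. This is only a sketch of the idea, not something tested on the modem in question:

// Decode %XX escapes and '+' (space) from a query string.
public static String percentDecode(String s) {
    StringBuffer out = new StringBuffer(); // StringBuffer for pre-1.5 compatibility
    for (int i = 0; i < s.length(); i++) {
        char c = s.charAt(i);
        if (c == '%' && i + 2 < s.length()) {
            // two hex digits follow, e.g. %5E -> '^', %20 -> ' '
            out.append((char) Integer.parseInt(s.substring(i + 1, i + 3), 16));
            i += 2;
        } else if (c == '+') {
            out.append(' ');
        } else {
            out.append(c);
        }
    }
    return out.toString();
}

The client would also need to percent-encode the spaces (e.g. as %20) instead of sending "string sending test" literally, since a raw space ends the request line in HTTP.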

Downloading xml data on very slow 2G internet

I have successfully programmed an app that takes traces of a lot of system services (GPS location, network location, Wi-Fi, neighbouring cell info, sensors, ...) every 10 seconds, and it works very well. But as soon as I restrict the internet on my phone to 2G and turn off Wi-Fi, the app still works but starts to lag.
I have tried to find out where the problem comes from, and I have noticed that it comes from these lines of code:
XmlPullParser receivedData = XmlPullParserFactory.newInstance().newPullParser();
receivedData.setInput(xmlUrl.openStream(), null); // setInput returns void, so it cannot be chained
return receivedData;
As soon as I delete these couple of lines from my activity, the app works without lagging, but since they are essential to the app I would very much like to keep them (they already work) without causing lags.
Can anyone please help me?
I have printed the parsed result from the XML file and it is correct, so my only problem here is the lagging of the app.
A typical XML file that I would be dealing with looks like this:
<rsp stat="ok">
<cell lat="49.88415658974359" lon="8.637537076923078" mcc="262" mnc="7" lac="41146"
cellid="42404" averageSignalStrength="-79" samples="39" changeable="1"/>
</rsp>
2G is a really slow connection. Even worse is the "warm-up" of the antenna: it may take up to 30 seconds before the first bit is received. (And there is not really anything you can do against this, because it is all about physics.)
So the only thing you can do is load the file in a background thread. This will keep the app responsive (if you don't need the data immediately).
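A minimal sketch of such a background thread in an Android activity; fetchXml() is a placeholder name for the parsing code above:

new Thread(new Runnable() {
    public void run() {
        // slow network and parsing work happens off the UI thread
        final XmlPullParser receivedData = fetchXml();
        runOnUiThread(new Runnable() {
            public void run() {
                // hand the parsed data back to the UI here
            }
        });
    }
}).start();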
Maybe use an explicit BufferedInputStream. Instead of
XmlPullParser receivedData = XmlPullParserFactory.newInstance().newPullParser();
receivedData.setInput(xmlUrl.openStream(), null);
try
XmlPullParser receivedData = XmlPullParserFactory.newInstance().newPullParser();
receivedData.setInput(new BufferedInputStream(xmlUrl.openStream()), null);
Maybe, maybe compression
As you know, in HTTP a browser may declare in its headers that it can decompress compressed data; the server may then send a compressed version of the content. This puts less load on the network and may speed up communication, depending on the data.
The same one can do oneself.
For an external site one does not control, one might try. Send the request header
Accept-Encoding: gzip
and one is lucky when receiving the response header:
Content-Encoding: gzip
Doing both sides oneself works by wrapping the streams:
outputStream = new GZIPOutputStream(outputStream);
inputStream = new GZIPInputStream(inputStream);
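For the HTTP case, a hedged sketch of asking for gzip and unwrapping it only if the server agrees (xmlUrl as in the question; GZIPInputStream is java.util.zip):

HttpURLConnection conn = (HttpURLConnection) xmlUrl.openConnection();
conn.setRequestProperty("Accept-Encoding", "gzip");
InputStream in = conn.getInputStream();
if ("gzip".equalsIgnoreCase(conn.getContentEncoding())) {
    in = new GZIPInputStream(in); // server agreed; decompress on the fly
}
// then parse as before, buffered
receivedData.setInput(new BufferedInputStream(in), null);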
Saving memory
Making identical string instances unique reduces memory and might help, even if it costs considerable time itself. String.intern() is a bad idea, because prior to Java 8 interned strings go into the permanent (unrecoverable) memory space. One might use a map instead (needs java.util.Map / java.util.HashMap):
private Map<String, String> identityMap = new HashMap<>();

public String unique(String s) {
    if (s.length() >= 30) {
        return s; // long strings are rarely repeated; do not cache them
    }
    String t = identityMap.get(s);
    if (t == null) {
        t = s;
        identityMap.put(s, t);
    }
    return t;
}
The hope is that overall processing becomes faster.

Java 7 URL Connection Fail

The following code used to work fine under Java 6 (and earlier), but it stopped working after updating to JRE 7 (Java 7).
The URL is an FTP file:
ftp://ftp-private.ncbi.nlm.nih.gov/pubchem/.fetch/96/4133257873201306969.sdf.gz
Here is the output I get:
application/octet-stream
-1 [Ljava.lang.StackTraceElement;@5419f97c
And here is my code:
public static void store(URL url, File targetFile) {
    try {
        System.out.println(url);
        URLConnection uc = url.openConnection();
        String contentType = uc.getContentType();
        System.out.println(contentType);
        int contentLength = uc.getContentLength();
        System.out.println(contentLength);
        Settings.setDownloadSize(contentLength);
        if (contentType.startsWith("text/") || contentLength == -1) {
            throw new IOException("This is not a binary file.");
        }
        InputStream raw = uc.getInputStream();
        InputStream in = new BufferedInputStream(raw);
        byte[] data = new byte[contentLength];
        int bytesRead = 0;
        StatusPanel.updateProgrssBar(bytesRead);
        int offset = 0;
        while (offset < contentLength) {
            bytesRead = in.read(data, offset, data.length - offset);
            if (bytesRead == -1) {
                break;
            }
            offset += bytesRead;
            StatusPanel.updateProgrssBar(offset);
        }
        in.close();
        if (offset != contentLength) {
            throw new IOException("Only read " + offset + " bytes; Expected " + contentLength + " bytes");
        }
        FileOutputStream out = new FileOutputStream(targetFile);
        out.write(data);
        out.flush();
        out.close();
        //StatusPanel.setStatus("File has been stored at " + targetFile.toString());
        //System.out.println("file has been stored at " + targetFile.toString());
    } catch (IOException e) { // the original snippet never closed the try block
        e.printStackTrace();
    }
}
The content length returns -1.
How do I make this code compatible with Java 7?

This looks like the behavioral change documented in the Java 7 compatibility notes:

Area: API: Networking
Synopsis: Server Connection Shuts Down when Attempting to Read Data When HTTP Response Code is -1
Description: As a result of the bug fix for CR 6886436, the HTTP protocol handler will close the connection to a server that sends a response without a valid HTTP status line. When this occurs, any attempt to read data on that connection results in an IOException.
For example, the following code is problematic:

public static void test() throws Exception {
    .....
    HttpURLConnection urlc = (HttpURLConnection) url.openConnection();
    ....
    System.out.println("Response code: " + urlc.getResponseCode());
    /* The following line throws java.io.IOException: Invalid Http response
     * when the response code returned was -1 */
    InputStream is = urlc.getInputStream(); // PROBLEMATIC CODE

To work around this problem, check the return value from the getResponseCode method and deal with a -1 value appropriately, perhaps by opening a new connection or invoking getErrorStream on the connection.
Nature of incompatibility: behavioral
RFE: 7055058
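A minimal sketch of that suggested workaround (url as in the release note's example):

HttpURLConnection urlc = (HttpURLConnection) url.openConnection();
int code = urlc.getResponseCode();
if (code == -1) {
    // no valid HTTP status line; reopen the connection or inspect the error stream
    InputStream err = urlc.getErrorStream();
    // handle or retry here
} else {
    InputStream is = urlc.getInputStream(); // safe to read now
}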
The problem is definitely with the getContentLength() method: with JRE 6 it returns a value, but with JRE 7 I get -1.
Based on Java 7's Javadoc for URLConnection, there are two possible reasons this is happening.
The first possible cause is that the content length is greater than Integer.MAX_VALUE. To determine whether this is the issue, use getContentLengthLong(), which returns a long instead of an int; when the content length exceeds Integer.MAX_VALUE, getContentLength() returns -1. Since Java 7 it is also preferred to use getContentLengthLong() over getContentLength(), as the URLConnection Javadoc puts it, because "it returns a long instead and is therefore more portable." If you need to support both JRE 6 and JRE 7, consider writing Java 6 and Java 7 wrapper classes that expose a common set of methods your application uses to interact with URLs, and have your application's start script load the proper wrapper according to the host's JRE version. This is generally good design because it keeps your application from depending on one specific JRE, third-party library, or application.
The second possibility is that the content-length header field is not known to the server, in which case both getContentLength() and getContentLengthLong() return -1. This is why I suggest trying getContentLengthLong() before anything else; it will probably be the quickest fix. If both methods return -1, use a tool such as Apache JMeter to inspect the header information. A quick way of doing this is to run JMeter's HTTP Proxy Server with your browser's proxy settings pointed at localhost and the proxy's port; each recorded request contains an HTTP Header Manager listing every header name with its value.
Lastly, you may want to analyse the server itself: verify that the logs look OK, that all the correct processes are up, that configurations are set correctly, that the file still exists and is in the correct location, and so on. Maybe the server no longer responds to content-length requests. Also, verify whether your code works under JRE 7 on another host.
I hope these suggestions are of value to you and that you are able to solve this issue. I would also suggest using wrapper classes and following the release notes of each third-party class you depend on, since practices like reducing external dependencies make your code easier to maintain.
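If the root cause is simply that the server no longer reports a usable content length, one option is to stop depending on it altogether and read until EOF. Here is a sketch of the store() loop rewritten that way (same url and targetFile as above); the trade-off is that a determinate progress bar is no longer possible when the length is unknown:

URLConnection uc = url.openConnection();
InputStream in = new BufferedInputStream(uc.getInputStream());
ByteArrayOutputStream data = new ByteArrayOutputStream();
byte[] buf = new byte[8192];
int n;
while ((n = in.read(buf)) != -1) {
    data.write(buf, 0, n); // buffer grows as bytes arrive; no length needed
}
in.close();
FileOutputStream out = new FileOutputStream(targetFile);
data.writeTo(out);
out.close();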

Storing java objects online

This is my first question on Stack Overflow; I hope you can help me. I've done a bit of searching online, but I keep finding tutorials or answers about reading text files with a BufferedReader or reading raw bytes from files on the internet. Ideally, I'd like to have a file on my server called "http://ascistudent.com/scores.data" that stores all of the Score objects made by players of a game I have made.
The game is a simple "block-dropping" game where you try to get 3 of the same blocks touching to increase the score. When time runs out, the scores are loaded from the file and the player's score is inserted at the right position in a List of Score objects. After that the scores are saved back to the same file.
At the moment I get an exception, java.io.EOFException, on the highlighted line:
URL url = new URL("http://ascistudent.com/scores.data");
InputStream is = url.openStream();
Score s;
ObjectInputStream load;
//if (is.available() == 0) return;
load = new ObjectInputStream(is); // ---------- java.io.EOFException
while ((s = (Score) load.readObject()) != null) {
    scores.add(s);
}
load.close();
I suspect that this is due to the file being empty. But when I catch the exception and tell it to write to the file anyway (after changing the Score list) with the following code, nothing appears to be written (the exception keeps happening):
URL url = new URL("http://ascistudent.com/scores.data");
URLConnection ucon = url.openConnection();
ucon.setDoInput(true);
ucon.setDoOutput(true);
os = ucon.getOutputStream();
ObjectOutputStream save = new ObjectOutputStream(os);
for (Score s : scores) {
    save.writeObject(s);
}
save.close();
What am I doing wrong? Can anyone point me in the right direction?
Thanks very much,
Luke
Natively you can't write to a URLConnection unless that connection is writable.
What I mean is that you cannot directly write to a URL unless the other side accepts what you are going to send. In HTTP this is done through a POST request that attaches data from your client to the request itself.
On the server side you'll have to accept this POST request, take the data, and add it to scores.data. You can't directly write to the file; you need to process the request in the webserver, e.g.:
http://host/scores.data
provides the data, while
http://host/uploadscores
should be a different URL that accepts a POST request, processes it, and modifies scores.data on the server.
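A hedged sketch of the client side of that design; the /uploadscores endpoint is illustrative, and the server must itself implement the handler that rewrites scores.data:

URL url = new URL("http://ascistudent.com/uploadscores");
HttpURLConnection conn = (HttpURLConnection) url.openConnection();
conn.setRequestMethod("POST");
conn.setDoOutput(true); // we are sending a request body
conn.setRequestProperty("Content-Type", "application/octet-stream");
ObjectOutputStream save = new ObjectOutputStream(conn.getOutputStream());
for (Score s : scores) {
    save.writeObject(s);
}
save.close();
// the request is only actually sent once the response is read
int responseCode = conn.getResponseCode();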

Effective way of doing http queries to a server on Java

I'm working on a piece of software that does extensive queries to a database that has an HTTP interface, so my program parses and handles queries that come in the form of long http:// addresses.
I have realized that the bottleneck of this whole system is the querying: the data transfer barely goes above 20 KB/s even though I am sitting in the university network with a gigabit connection. Recently a friend of mine mentioned that I might have written my code in an ineffective way and that this might be the reason for the lack of speed. So my question is: what is the fastest/most effective way of getting data from a web source in Java?
Here's the code I have right now:
private void handleQuery(String urlQuery, int qNumber, BufferedWriter out) {
    BufferedReader reader;
    try {
        // IO routines: read from the web service and print to a log file
        reader = new BufferedReader(new InputStreamReader(openURL(urlQuery)));
        ....
    }
}

private InputStream openURL(String urlName) throws IOException {
    URL url = new URL(urlName);
    URLConnection urlConnection = url.openConnection();
    return urlConnection.getInputStream();
}
Your code looks good to me; the snippet doesn't explain the slow read.
Possible problems are:
Network issues. Do an end-to-end network test to make sure the network is as fast as you think.
Server issues. Maybe the server is too slow.
Thread contention. Check whether you have any threading issues.
A profiler and a network trace will pinpoint the problem.
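One quick sanity check along those lines: time a raw, parse-free read of the same URL to separate network speed from processing cost (a sketch; urlQuery as in the question):

long start = System.nanoTime();
InputStream in = new URL(urlQuery).openStream();
byte[] buf = new byte[8192];
long total = 0;
int n;
while ((n = in.read(buf)) != -1) {
    total += n; // count bytes only; do no processing
}
in.close();
double seconds = (System.nanoTime() - start) / 1e9;
System.out.println(total + " bytes in " + seconds + " s");

If this raw read is also slow, the bottleneck is the network or the server, not your parsing code.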
There is nothing in the code you have provided that should be a bottleneck. The problem is probably somewhere else; e.g. what you are doing with the characters after you read them, how the remote server is writing them, network or web proxy issues, etc.
