We are downloading a very large file (~70 GB), but on one occasion the code completed without throwing an exception, yet the downloaded file was incomplete, just under 50 GB.
The code is:
public void download(String url, String filename) throws Exception {
    URL dumpUrl = new URL(url);
    try (InputStream input = dumpUrl.openStream()) {
        Files.copy(input, Paths.get(filename));
    }
}
The url is a presigned Google Cloud Storage URL.
Is this just the libraries not detecting a connection reset issue? Or something else?
Are there better libraries I could use? Or do I need to do a HEAD call first and then match the downloaded size against the Content-Length header?
I don't care that it didn't work; that happens, and we have retry logic. My issue is that the code thought it did work.
UPDATE: It seems it failed exactly 2 hours after the download started. This makes me suspect a netops/firewall issue. I'm not sure at which end; I'll hassle my ops team for starters. Does anybody know of time limits at Google's end?
Ignore this update: we have more instances now, with no set time. Anywhere between 20 minutes and 2 hours.
We never resolved the core issue, but we were able to work around it by comparing the bytes downloaded to the Content-Length header. The code works in a loop which resumes an incomplete download using the Range header (similar to curl -C -).
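A minimal sketch of that workaround, assuming the presigned URL accepts HEAD and Range requests (a GET-signed URL may reject HEAD, in which case the Content-Length of the first GET response can be used instead; method and variable names here are illustrative, not our production code):

import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.nio.file.StandardOpenOption;

public void downloadWithResume(String url, String filename) throws IOException {
    Path target = Paths.get(filename);

    // Ask the server how big the file should be.
    // (A missing Content-Length, i.e. -1, would need separate handling.)
    HttpURLConnection head = (HttpURLConnection) new URL(url).openConnection();
    head.setRequestMethod("HEAD");
    long expected = head.getContentLengthLong();
    head.disconnect();

    for (int attempt = 0; attempt < 20; attempt++) {
        long have = Files.exists(target) ? Files.size(target) : 0;
        if (have >= expected) {
            return; // all bytes accounted for
        }
        HttpURLConnection conn = (HttpURLConnection) new URL(url).openConnection();
        // Request only the missing tail, like curl -C -.
        conn.setRequestProperty("Range", "bytes=" + have + "-");
        try (InputStream in = conn.getInputStream();
             OutputStream out = Files.newOutputStream(target,
                     StandardOpenOption.CREATE, StandardOpenOption.APPEND)) {
            byte[] buf = new byte[64 * 1024];
            int n;
            while ((n = in.read(buf)) != -1) {
                out.write(buf, 0, n);
            }
        } catch (IOException e) {
            // Silent truncation or a real error: the size check above decides whether to retry.
        }
    }
    throw new IOException("download did not complete after 20 attempts");
}

The key point is that success is decided by comparing Files.size(target) against Content-Length, never by the absence of an exception.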
I have been searching for information about this, but since I'm new to web development, the answers I'm finding are making me even more confused.
Basically, I have a webserver running on a Java modem (which uses a Java 1.3 IDE) that handles requests. These requests were processed fine as long as I kept them simple.
http://87.103.87.59/teste.html?a=10&b=10
This request is normally processed.
However, with the real request, my webserver crashes.
http://5.43.52.4/api.html?ATCOMMAND=AT%5EMTXTUNNEL=SMS,0035111111111,string sending test
The problem is due to two aspects: the "%" character and the spaces in string sending test.
To make everything clear, the handlers I'm using are these:
public InputStream is = null;
private OutputStream os = null;
private byte buffer[] = new byte[1024]; // the original had no size; 1024 is assumed
String streamAux = "";

is = socketX.openInputStream();
os = socketX.openOutputStream();
if ((is.available() > 0) || (blockX == true))
{
    // Read data sent from remote client
    numDadosLidos = is.read(buffer);
    for (int i = 0; i < numDadosLidos; i++)
        streamAux = streamAux + (char) buffer[i]; // where the url will be stored
}
Basically, I need those parameters so I can use them to operate my Java device, so I think I'll need to do some sort of URL decoding, but there's a lot of information I can't comprehend, and my Java 1.3 IDE is keeping me stuck.
I apologize in advance for any newbie behaviour.
Hope you can lend me a hand,
Thanks
For those who are interested, I basically worked around the issue by requiring the message to be sent using the '-' character instead of spaces. It doesn't solve the underlying issue; it simply sidesteps it in a not-ideal way.
Still totally interested if someone figures this one out.
Thanks.
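Since a Java 1.3 ME environment typically doesn't ship java.net.URLDecoder, one way out is to have the client percent-encode the spaces (%20) and the '^' (%5E), and then decode the query string by hand on the modem. A sketch, untested on the actual device, assuming single-byte characters:

// Decodes "%5E" -> '^', "%20" -> ' ', and '+' -> ' '.
public static String percentDecode(String s) {
    StringBuffer out = new StringBuffer(); // StringBuilder does not exist in Java 1.3
    for (int i = 0; i < s.length(); i++) {
        char c = s.charAt(i);
        if (c == '+') {
            out.append(' ');
        } else if (c == '%' && i + 2 < s.length()) {
            // The two hex digits after '%' are the character code.
            out.append((char) Integer.parseInt(s.substring(i + 1, i + 3), 16));
            i += 2;
        } else {
            out.append(c);
        }
    }
    return out.toString();
}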
The following code is one of the functions in my program:
public void download(String url) throws IOException {
    URL website = new URL(url);
    // try-with-resources closes the channel and stream even when the copy fails.
    try (ReadableByteChannel rbc = Channels.newChannel(website.openStream());
         FileOutputStream fos = new FileOutputStream("test.csv")) {
        fos.getChannel().transferFrom(rbc, 0, Long.MAX_VALUE);
    }
}
Problems:
Sometimes it works and downloads the file.
But sometimes it downloads the file with nothing inside.
And sometimes it throws an exception:
Exception in thread "main" java.io.IOException: Invalid Http response
at sun.net.www.protocol.http.HttpURLConnection.getInputStream0(HttpURLConnection.java:1555)
at sun.net.www.protocol.http.HttpURLConnection.getInputStream(HttpURLConnection.java:1441)
at java.net.URL.openStream(URL.java:1038)
at linkproject.Link.download(Link.java:65)
at linkproject.Link.continualDownload(Link.java:158)
at linkproject.Link.main(Link.java:183)
"link project" is just my main program.
PS: In my program, I change the URL every 5 seconds and then call the download function. Sometimes it throws the exception and sometimes it doesn't, which is what confuses me.
For example, my URL is:
http://www.taifex.com.tw/chinese/3/7_12_8dl.asp?syear=2015&smonth=10&sday=16&eyear=2015&emonth=10&eday=16&COMMODITY_ID=TXF
Each time, I change the value of sday, smonth, syear, and so on.
I want to figure out why the exception occurs and how I can fix it.
I think I've come up with a solution.
Based on all the comments you guys posted, I simply catch the IOException and run the code again until there is no exception.
Furthermore, once an IOException occurs, I sleep for 1 or 2 seconds to avoid opening too many concurrent connections.
For the problem of "nothing inside", I check the content, and if it is empty I download it again until there is something useful inside.
It turns out this works fine. BTW, thanks for all your comments. A sketch of the result is below.
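The retry loop, roughly (the file name and 2-second delay mirror the description above; a real version would cap the number of attempts instead of looping forever):

import java.io.File;
import java.io.IOException;

public void downloadWithRetry(String url) throws InterruptedException {
    while (true) {
        try {
            download(url); // the method shown above
            // Treat an empty file as a failed attempt as well.
            if (new File("test.csv").length() > 0) {
                return; // success: a non-empty file is on disk
            }
        } catch (IOException e) {
            // "Invalid Http response" lands here; fall through to retry.
        }
        Thread.sleep(2000); // back off so we don't hammer the server
    }
}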
I have successfully programmed an app that takes traces of a lot of system services (GPS location, network location, Wi-Fi, neighbouring cell info, sensors, ...) every 10 seconds, which works very well. But as soon as I restrict the Internet on my phone to 2G only and turn off Wi-Fi, the app still works but starts to lag.
I have tried to find out where the problem comes from, and I have noticed that it comes from these lines:
XmlPullParser receivedData = XmlPullParserFactory.newInstance().newPullParser();
receivedData.setInput(xmlUrl.openStream(), null); // setInput() returns void, so it can't be chained
return receivedData;
As soon as I delete these couple of lines in my activity, the app works without lagging, but seeing as they are essential for my app, I would very much like to have them work (which they already do) but without causing lag.
Can anyone please help me?
I have printed the parsed result from the XML file and it is correct, so my only problem here is the lagging of the app.
A typical XML file that I would be dealing with looks like this:
<rsp stat="ok">
<cell lat="49.88415658974359" lon="8.637537076923078" mcc="262" mnc="7" lac="41146"
cellid="42404" averageSignalStrength="-79" samples="39" changeable="1"/>
</rsp>
2G is a really slow connection. Even worse is the "warm-up" of the antenna: it may take up to 30 seconds before the first bit is received. (And there is not really anything you can do about this, because it is all physics.)
So the only thing you can do is load the file in a background thread. This keeps the app responsive (if you don't need the data immediately), as sketched below.
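A minimal sketch of that idea (fetchAndParse() is a hypothetical helper wrapping the parser code from the question; the snippet assumes it runs inside an Activity, whose runOnUiThread() hands the result back to the UI thread):

new Thread(new Runnable() {
    public void run() {
        try {
            // The slow 2G network work stays off the UI thread.
            final XmlPullParser receivedData = fetchAndParse(); // hypothetical helper
            runOnUiThread(new Runnable() {
                public void run() {
                    // Consume receivedData here, back on the UI thread.
                }
            });
        } catch (Exception e) {
            // Log it and try again on the next 10-second tick.
        }
    }
}).start();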
Maybe explicitly use a BufferedInputStream. Instead of

XmlPullParser receivedData = XmlPullParserFactory.newInstance().newPullParser();
receivedData.setInput(xmlUrl.openStream(), null);

use

XmlPullParser receivedData = XmlPullParserFactory.newInstance().newPullParser();
receivedData.setInput(new BufferedInputStream(xmlUrl.openStream()), null);
Maybe compression
As you know, in HTTP a browser may declare in its headers that it can decompress compressed data, and the server may then send a compressed version of the HTML. This reduces the amount of data transferred and may speed up communication, depending on the circumstances.
One can do the same oneself.
For an external, uncontrolled site one might try sending the header
Accept-Encoding: gzip
and one is lucky if the response carries the header
Content-Encoding: gzip
When doing both sides oneself, wrap the streams (java.util.zip):
outputStream = new GZIPOutputStream(outputStream);
inputStream = new GZIPInputStream(inputStream);
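On the client side against an HTTP server, a sketch with HttpURLConnection and java.util.zip.GZIPInputStream, reusing xmlUrl and receivedData from the question, and only unwrapping when the server actually answered with gzip:

HttpURLConnection conn = (HttpURLConnection) xmlUrl.openConnection();
conn.setRequestProperty("Accept-Encoding", "gzip");
InputStream in = conn.getInputStream();
if ("gzip".equalsIgnoreCase(conn.getContentEncoding())) {
    in = new GZIPInputStream(in); // the server honoured the request
}
receivedData.setInput(new BufferedInputStream(in), null);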
Saving memory
Making equal string instances unique reduces memory and might help, even if it costs considerable time itself. String.intern() is a bad idea, as prior to Java 8 interned strings go into the permanent generation, a memory space that is effectively unrecoverable. One might use a map like this:
private Map<String, String> identityMap = new HashMap<>();

public String unique(String s) {
    if (s.length() >= 30) {
        return s; // long strings are unlikely to repeat; don't cache them
    }
    String t = identityMap.get(s);
    if (t == null) {
        t = s;
        identityMap.put(s, t);
    }
    return t;
}
The hope is that processing becomes faster.
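For example, the attribute values pulled from the parser could be funnelled through it, something like unique(parser.getAttributeValue(null, "cellid")) (plus a null check), so that the values repeated across the 10-second polls share a single instance each.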
I am going to read from a socket in Java. Here is what I do:
System.out.println("Start Reading");
/* bab is the socket connector, and readLine() is this method: */
/*
public String readLine() throws IOException {
    String a = inStream.readLine();
    return a;
}
*/
for (int j = 0; j < 9; j++) {
    response = bab.readLine();
    System.out.println(response);
}
I see a long delay (2-3 seconds) between printing "Start Reading" and the first line of the response. But when I requested the same URL with Firefox, it responded quickly (20 ms). What is the problem, and how can I solve it?
I suspect the reason is that the server doesn't send the line delimiter for some time, so the readLine() method blocks waiting for it. I bet if you read raw bytes instead, it will be quick; see the sketch below.
Since Firefox, or any other browser, doesn't read line by line, this doesn't affect them.
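A sketch of reading whatever bytes are available instead of waiting for a full line (inStream stands for the socket's input stream, as in the question):

byte[] buf = new byte[4096];
int n;
// read() returns as soon as some bytes have arrived; it does not wait for '\n'.
while ((n = inStream.read(buf)) != -1) {
    System.out.print(new String(buf, 0, n));
}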
Firefox is probably caching the response and is therefore able to display it very quickly to you. I suggest you clear the cache on Firefox and time it again.
If you are using a domain name for the call, Firefox will also have cached the DNS lookup, which saves it time, whereas making the call from Java could require a fresh DNS lookup.
If you are using Windows, download Fiddler, which will let you monitor the HTTP connection and give you a better idea of what is happening.
What is the reason for encountering this Exception:
org.apache.commons.fileupload.FileUploadException:
Processing of multipart/form-data request failed. Stream ended unexpectedly
The main reason is that the underlying socket was closed or reset. Most commonly, the user closed the browser before the file was fully uploaded, or the connection was interrupted during the upload. In any case, the server-side code should handle this exception gracefully.
It's been about a year since I dealt with that library, but if I remember correctly, if someone starts uploading a file and then changes the browser URL (clicks a link, opens a bookmark, etc.), you could get that exception.
You could possibly get this exception if you're using FileUpload to receive an upload from Flash.
At least as of version 8, Flash contains a known bug: the multipart stream it produces is broken, because the final boundary doesn't carry the "--" suffix, which ought to indicate that no more items follow. Consequently, FileUpload waits for the next item (which it doesn't get) and throws an exception.
There is a workaround: use the streaming API and catch the exception.
catch (MalformedStreamException e) {
    // Ignore this
}
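In context that might look roughly like this (a sketch of the streaming API inside a servlet's doPost(); the FileUploadException and IOException that the calls also declare still need handling, and per-part processing is elided):

import java.io.InputStream;
import org.apache.commons.fileupload.FileItemIterator;
import org.apache.commons.fileupload.FileItemStream;
import org.apache.commons.fileupload.MultipartStream;
import org.apache.commons.fileupload.servlet.ServletFileUpload;

ServletFileUpload upload = new ServletFileUpload();
try {
    FileItemIterator iter = upload.getItemIterator(request);
    while (iter.hasNext()) {
        FileItemStream item = iter.next();
        InputStream stream = item.openStream();
        // ... consume each part's stream here ...
    }
} catch (MultipartStream.MalformedStreamException e) {
    // Flash's missing "--" on the final boundary lands here; ignore it.
}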
For more details, please refer to https://commons.apache.org/proper/commons-fileupload/faq.html#missing-boundary-terminator