I have a Java applet that streams video (MJPEG) from a server. I wrote a proxy server in C# (a Windows service) to sit between the applet and multiple video servers. An HTML/CSS/JS frontend is used along with the Java applet. All functionality works fine (finally!!!), except one thing.
The video server allows you to play back recorded video through a REST interface. When the clip is done, the server leaves the connection open in case you want to send it commands like rewind or seek. The clip plays fine in the applet until the end. If you try to start a new clip (which entails sending a command from JavaScript to the applet), the browser freezes up. However, subsequent commands that would use the same connection, such as play, pause, and seek, work. If I stop the Windows service, the browser becomes responsive again.
This is what I'm assuming is happening: The clip ends (or is paused); no more data is sent but the connection is still active. The applet is waiting on the proxy for the next frame, but the proxy is waiting on the video server for the next frame, which is not going to send any more data.
This is the code, inside a while loop, that reads each frame:
byte[] img = new byte[mContentLength];
inputStream.skipBytes(headerLen);
inputStream.readFully(img);
I need to interrupt this code somehow.
When a new video clip is selected in the HTML frontend, we notify the applet, which calls disconnect() on the CameraStream class. This is that function:
// DataInputStream inputStream
// HttpURLConnection conn
public void disconnect() {
    System.out.println("disconnect called.");
    if (running) {
        running = false;
        try {
            // close the socket
            if (inputStream != null) {
                inputStream.close();
            }
            if (conn != null) {
                conn.disconnect();
            }
            inputStream = null;
            System.out.println("closed.");
        } catch (Exception ignored) {
            System.out.println("exc:" + ignored.getMessage());
            main.reportErrorFromThrowable(ignored);
        }
    }
}
To test this, I let a quick clip play and run to the end, then select a new clip. In my Java console, I get the "disconnect called." output, but I don't get the subsequent "closed." message, nor does that generic Exception get caught. When I stop the Windows service, I finally get the "closed." message, so it seems like inputStream.close(); is blocking.
So I guess my question is: how can I stop the blocking? Is the readFully(img) call blocking, or is it the disconnect function (as suggested by the console output I get)?
Edit: just to clarify, I wrote the Java applet, HTML, CSS, JavaScript, and C# proxy server, so I have access to all of that code. The only code I can't modify is that of the REST interface on the video server.
Edit 2: I meant to put the bounty on this post: https://stackoverflow.com/questions/12219758/proxy-design-pattern
In general, Java I/O methods block. The best solution appears to be to create another thread for reading the data and to use NIO channels and buffers. Example of an NIO-based read (warning: untested!):
// get the InputStream from somewhere (a queue possibly)
ReadableByteChannel inChannel = Channels.newChannel(inputStream);
ByteBuffer buf = ByteBuffer.allocate(mContentLength + headerLen);
while (buf.hasRemaining() && inChannel.read(buf) != -1); // keep reading until header + frame are in
buf.flip();
buf.position(headerLen);              // skip over the frame header
byte[] img = new byte[mContentLength];
buf.get(img);                         // copy the frame bytes into img
This code creates a Channel from the InputStream and uses the Channel to read data. The JavaDoc for the ReadableByteChannel.read(ByteBuffer) function says that interrupting the thread that contains the call to inChannel.read(buf) will stop the read.
You will have to adapt this code; I just pulled it out of my head. Good luck!
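For instance, here is a rough sketch of how the reader thread might be wired up so that disconnect() can interrupt it. The readerThread field and startReading method are made-up names; running, inputStream, mContentLength, and headerLen are taken from the question. Untested.
// needs java.nio.ByteBuffer, java.nio.channels.Channels, java.nio.channels.ReadableByteChannel
private Thread readerThread;

private void startReading(final InputStream inputStream) {
    readerThread = new Thread(new Runnable() {
        public void run() {
            ReadableByteChannel inChannel = Channels.newChannel(inputStream);
            try {
                while (running) {
                    ByteBuffer buf = ByteBuffer.allocate(mContentLength + headerLen);
                    while (buf.hasRemaining() && inChannel.read(buf) != -1);
                    buf.flip();
                    buf.position(headerLen);
                    byte[] img = new byte[mContentLength];
                    buf.get(img);
                    // hand img off to the display code here
                }
            } catch (Exception e) {
                // ClosedByInterruptException / IOException: we were asked to stop
            }
        }
    });
    readerThread.start();
}

public void disconnect() {
    running = false;
    if (readerThread != null) {
        readerThread.interrupt(); // per the JavaDoc, this should break the blocked read
    }
}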
I finally figured out the answer:
public void disconnect() {
    if (running) {
        running = false;
        try {
            try {
                // had to add this
                conn.getOutputStream().close();
            } catch (Exception exc) {
            }
            // close the socket
            if (inputStream != null) {
                inputStream.close();
            }
            if (conn != null) {
                conn.disconnect();
            }
            inputStream = null;
        } catch (Exception ignored) {
            main.reportErrorFromThrowable(ignored);
        }
    }
}
Even though I'm using an HttpURLConnection, which is one-way here and doesn't have an output stream, trying to close the output stream raised an exception and for some reason made it all work.
Related
I am working on an Android app which has I/O streams open to a PC app. I want to keep an eye on the inputStream and get notified when new data comes in from the PC side. Currently I am while(true)-ing {istream.read()}, and once it does not throw an error I know that something new just arrived, so I do the proper action. Is there a better way to keep listening to the inputStream? Is this a correct way of listening?
//assume we have inputStream instance
while (true) {
    val bytes = ByteArray(size)
    try {
        inputStream.read(bytes, 0, size)
        //consume the bytes!
    } catch (e: Exception) {
        //do nothing because nothing arrived
    }
}
How to close a Java client socket correctly?
Is it necessary to close socket.getOutputStream()?
Is it necessary to close socket.getInputStream()?
Is it necessary to call socket.shutdownInput()?
Is it necessary to call socket.shutdownOutput()?
What should be the sequence of calls (before|after) socket.close()?
The Socket documentation states:
Closing this socket will also close the socket's InputStream and OutputStream.
You don't have to shut down the input/output. However, doing so does allow you to "half" close the socket, say if you want to continue to send data but want to indicate that you will no longer receive it.
So, in short: it's completely fine to do the following:
...
finally {
    if (socket != null)
        socket.close();
}
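For the "half" close mentioned above, a minimal sketch using shutdownOutput() (finishSending is just an illustrative name):
import java.io.IOException;
import java.net.Socket;

static void finishSending(Socket socket) throws IOException {
    // Half-close: tell the peer we will send no more data (a TCP FIN goes out),
    // while we can still read whatever the peer sends back.
    socket.shutdownOutput();

    // ... read any remaining data from socket.getInputStream() here ...

    // One close() at the end is enough; it also closes both streams.
    socket.close();
}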
I need to know what could cause the InputStream to hang indefinitely on close. Here's my code.
URL url = new URL("ftp://..");
InputStream is = url.openStream();
BufferedReader reader = new BufferedReader(new InputStreamReader(new GZIPInputStream(is)));
try {
    while (true) {
        if (reader.readLine() == null) {
            break;
        }
    }
} catch (Exception e) {
    e.printStackTrace();
} finally {
    System.out.println("Closing reader");
    is.close(); // sometimes hangs indefinitely
    System.out.println("Reader closed");
}
Closing InputStream or BufferedReader has the same effect.
I need to know what could cause the InputStream to hang indefinitely on close
I think it is the nature of the particular input stream that you are using.
You have a stream open to a read a file from an FTP server. Depending on the protocol driver that is being used on the Java client side, when you close() the file it may be attempting to close an active FTP session. This could involve exchanging network packets with the remote server, and if there is a network problem or the server has died, then that could take "a long time" ... depending on how long network timeouts have been set to, etc.
It could also be something like this:
Java HttpURLConnection InputStream.close() hangs (or works too long?)
My recommendation would be to capture a thread stack trace while a connection is hung in close() and then delve into the Java codebase to figure out exactly where it is hung. (There are too many possible places for a problem to occur to investigate this without concrete evidence.)
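For example, you can trigger a thread dump externally with jstack <pid> (or kill -3 on Unix), or capture one programmatically from a watchdog thread; a rough sketch:
// Print the stack trace of every live thread, e.g. once close() has been
// stuck for a while.
static void dumpAllStacks() {
    for (java.util.Map.Entry<Thread, StackTraceElement[]> entry
            : Thread.getAllStackTraces().entrySet()) {
        System.err.println(entry.getKey());
        for (StackTraceElement frame : entry.getValue()) {
            System.err.println("    at " + frame);
        }
    }
}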
Also try removing the GZIPInputStream from the stack to see if that makes any difference.
Finally, if you come back to us with more evidence, please tell us the precise Java version you are using and the real URL of the FTP server.
I am currently implementing a web proxy, but I have run into a problem. I can parse the request from the browser and make a new request just fine, but I seem to have a problem with the response. It keeps hanging inside my response loop:
    serveroutput.write(request.getFullRequest());
    // serveroutput.newLine();
    serveroutput.flush();
    //serveroutput.
    //serveroutput.close();
} catch (IOException e) {
    System.out.println("Writting tothe server was unsuccesful");
    e.printStackTrace();
}
System.out.println("Write was succesful...");
System.out.println("flushed.");
try {
    System.out.println("Getting a response...");
    response = new HttpResponse(serversocket.getInputStream());
} catch (IOException e) {
    System.out.println("tried to read response from server but failed");
    e.printStackTrace();
}
System.out.println("Response was succesfull");
//response code
public HttpResponse(InputStream input) {
    busy = true;
    reader = new BufferedReader(new InputStreamReader(input));
    try {
        while (!reader.ready()); // wait for initialization.
        String line;
        while ((line = reader.readLine()) != null) {
            fullResponse += "\r\n" + line;
        }
        reader.close();
        fullResponse = "\r\n" + fullResponse.trim() + "\r\n\r\n";
    } catch (IOException e) {
        e.printStackTrace();
    }
    busy = false;
}
You're doing a blocking, synchronous read on a socket. Web servers don't close their connections after sending you a page (if HTTP/1.1 is specified), so it's going to sit there and block until the web server times out the connection. To do this properly, you would need to look for the Content-Length header and read the appropriate amount of data when you get to the body.
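A rough sketch of that Content-Length approach, assuming you already have the server socket's InputStream (untested; it ignores chunked transfer encoding and missing Content-Length headers):
import java.io.ByteArrayOutputStream;
import java.io.DataInputStream;
import java.io.IOException;
import java.io.InputStream;

static byte[] readBody(InputStream in) throws IOException {
    DataInputStream data = new DataInputStream(in);
    int contentLength = -1;
    String line;
    // Read the status line and header lines until the blank line that ends the headers.
    while ((line = readHeaderLine(data)).length() > 0) {
        if (line.toLowerCase().startsWith("content-length:")) {
            contentLength = Integer.parseInt(line.substring(15).trim());
        }
    }
    byte[] body = new byte[contentLength];
    data.readFully(body);          // read exactly Content-Length bytes, no more
    return body;
}

// Read a single CRLF-terminated header line as ISO-8859-1 text.
static String readHeaderLine(InputStream in) throws IOException {
    ByteArrayOutputStream buf = new ByteArrayOutputStream();
    int b;
    while ((b = in.read()) != -1 && b != '\n') {
        if (b != '\r') buf.write(b);
    }
    return buf.toString("ISO-8859-1");
}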
You really shouldn't be trying to re-invent the wheel; instead, use either the core Java HttpURLConnection or the Apache HttpClient to make your requests.
while (!reader.ready());
This line goes into an infinite loop, thrashing the CPU until the stream is available for read. Generally not a good idea.
You are making numerous mistakes here.
Using a spin loop calling ready() instead of just blocking in the subsequent read.
Using a Reader when you don't know that the data is text.
Not implementing the HTTP 1.1 protocol even slightly.
Instead of reviewing your code I suggest you review the HTTP 1.1 RFC. All you need to do to implement a naive proxy for HTTP 1.1 is the following:
Read one line from the client. This should be a CONNECT command naming the host you are to connect to. Read this with a DataInputStream, not a BufferedReader, and yes I know it's deprecated.
Connect to the target. If that succeeded, send an HTTP 200 back to the client. If it didn't, send whatever HTTP status is appropriate and close the client.
If you succeeded at (2), start two threads, one to copy all the data from the client to the target, as bytes, and the other to do the opposite.
When you get EOS reading one of those sockets, call shutdownOutput() on the other one.
If shutdownOutput() hasn't already been called on the input socket of this thread, just exit the thread.
If it has been called already, close both sockets and exit the thread.
Note that you don't have to parse anything except the CONNECT command; you don't have to worry about Content-length; you just have to transfer bytes and then EOS correctly.
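A minimal sketch of the copy thread from steps 3 and 4 (the close/exit bookkeeping from steps 5 and 6 is left out; names are illustrative, untested):
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.net.Socket;

// Copies bytes from one socket to the other until EOS, then half-closes
// the destination. Start one of these in each direction.
static Thread copy(final Socket from, final Socket to) {
    Thread t = new Thread(new Runnable() {
        public void run() {
            byte[] buffer = new byte[8192];
            try {
                InputStream in = from.getInputStream();
                OutputStream out = to.getOutputStream();
                int n;
                while ((n = in.read(buffer)) != -1) {
                    out.write(buffer, 0, n);
                    out.flush();
                }
                to.shutdownOutput();   // EOS reached: half-close the other side
            } catch (IOException e) {
                // connection dropped; let the caller close both sockets
            }
        }
    });
    t.start();
    return t;
}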
I have a list of feeds in a database that I use to download an XML file from an FTP server and then parse it. The script is bundled up into a jar file which is run daily using Windows Task Scheduler. Occasionally the request gets halted while grabbing a certain XML file. So far it has happened about 3 times in 2 weeks with no real pattern that I can see.
When it does mess up, I go to the computer it is being run from and see the command window open, stopped before the XML has been fully downloaded. If I close the command window and run the task manually, everything works fine.
The code that I am using to download the xml file is:
private void loadFTPFile(String host, String username, String password, String filename, String localFilename){
    System.out.println(localFilename);
    FTPClient client = new FTPClient();
    FileOutputStream fos = null;
    try {
        client.connect(host);
        client.login(username, password);
        String localFilenameOutput = createFile(assetsPath + localFilename);
        fos = new FileOutputStream(localFilenameOutput);
        client.retrieveFile(filename, fos);
    } catch (IOException e) {
        e.printStackTrace();
    } finally {
        try {
            if (fos != null)
                fos.close();
            client.disconnect();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}
This function is being called in a loop, and when it fails, everything stops and the script doesn't go on to the next feed.
I'm not sure what is happening; possibly the connection is being lost, but I would think the try/catch would catch that if it were happening. I'm not sure whether a timeout would do the trick or whether threads need to be used (but I've never worked with threads).
Could anyone point me in the right direction as to why this is happening and what I can do to fix the problem?
UPDATE - Set a timeout for the data connection
Since the last file is only partially downloaded, and given the source of FTPClient.retrieveFile(), I think it may be a problem on the server side (something that makes it hang, or even die, who knows). Obviously one can't repair the server or even know what's going on there; anyway, I suggest adding a timeout with setDataTimeout(int) and catching the possible SocketTimeoutException separately, so it can be logged in a different place and maybe sent to the FTP server admins (along with the time it happened) so they can merge the logs and see what the issue is.
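Something along these lines inside loadFTPFile (a sketch; depending on the commons-net version, the timeout may also surface wrapped inside a CopyStreamException):
client.connect(host);
client.setDataTimeout(120000); // 2-minute timeout on the data connection
client.login(username, password);
try {
    client.retrieveFile(filename, fos);
} catch (java.net.SocketTimeoutException ste) {
    // Log this separately, with a timestamp, so the FTP server admins
    // can correlate it with their own logs.
    System.err.println("Data connection timed out for " + filename + " at "
            + new java.util.Date() + ": " + ste.getMessage());
}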
OLD ANSWER
I didn't notice that you connect and log in for each and every file, so the following is just an optimization to keep the control connection from being closed so you can still log out successfully; it should not address the problem.
You could start the JVM in debug mode and attach a debugger when it hangs. Anyway, according to this answer and this thread, it can be a timeout problem on the network equipment (routers). From the FTPClient Javadoc:
During file transfers, the data connection is busy, but the control connection is idle. FTP servers know that the control connection is in use, so won't close it through lack of activity, but it's a lot harder for network routers to know that the control and data connections are associated with each other. Some routers may treat the control connection as idle, and disconnect it if the transfer over the data connection takes longer than the allowable idle time for the router.
One solution to this is to send a safe command (i.e. NOOP) over the control connection to reset the router's idle timer. This is enabled as follows:
ftpClient.setControlKeepAliveTimeout(300); // set timeout to 5 minutes
Do you check the return status of any of the calls or is that the code?
There is a call, completePendingCommand(), that has to be used on occasion. That may be something to look into.
Also, you won't see an IOException; I believe it gets repackaged as a CopyStreamException.
You might also want to change the return value to a boolean since you trap the exceptions; at least the calling loop will know whether the transfer happened or not.
private boolean loadFTPFile(String host, String username, String password, String filename, String localFilename){
    System.out.println(localFilename);
    FTPClient client = new FTPClient();
    FileOutputStream fos = null;
    // declared before the try so the catch blocks can use it too
    SimpleDateFormat sdf = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss.SSS");
    try {
        client.connect(host);
        int reply = client.getReplyCode();
        if (!FTPReply.isPositiveCompletion(reply)){
            client.disconnect();
            System.err.println("FTP server refused connection.");
            return false;
        }
        if (!client.login(username, password)){
            client.logout();
            return false;
        }
        String localFilenameOutput = createFile(assetsPath + localFilename);
        fos = new FileOutputStream(localFilenameOutput);
        boolean result = client.retrieveFile(filename, fos);
        if (result){
            System.out.println("\tFile Transfer Completed Successfully at: " + sdf.format(Calendar.getInstance().getTime()));
            // ftp.completePendingCommand();
        }
        else {
            System.out.println("\tFile Transfer Failed at: " + sdf.format(Calendar.getInstance().getTime()));
        }
        return result;
    } catch (CopyStreamException cse){
        System.err.println("\n\tFile Transfer Failed at: " + sdf.format(Calendar.getInstance().getTime()));
        System.err.println("Error Occurred Retrieving File from Remote System, aborting...\n");
        cse.printStackTrace(System.err);
        System.err.println("\n\nIOException Stack Trace that Caused the Error:\n");
        cse.getIOException().printStackTrace(System.err);
        return false;
    } catch (Exception e){
        System.err.println("\tFile Transfer Failed at: " + sdf.format(Calendar.getInstance().getTime()));
        System.out.println("Error Occurred Retrieving File from Remote System, aborting...");
        e.printStackTrace(System.err);
        return false;
    } finally {
        try {
            if (fos != null)
                fos.close();
            client.disconnect();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}
It's not a threading issue. Chances are it is caused by something in the loop since that code looks like it should clean up just fine. That said, for testing you will probably want to add
catch (Exception e) {
    e.printStackTrace();
}
after the IOException catch clause. It's possible that another exception is being thrown.
Another thing, if you are pulling results from the database result set one at a time and doing the FTP gets, that might be a problem. Unless the results are all brought back by the JDBC call at once, that too could time out. Not all database queries actually return the entire result set to the client at once.
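If that turns out to be the case, one option is to read all of the feed rows into memory before starting any transfers, so the FTP work never runs while the database cursor is still open (a sketch; the feeds table and its column names are made up):
import java.sql.Connection;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;
import java.util.ArrayList;
import java.util.List;

// Pull every feed row out of the ResultSet first, then loop over the list
// and call loadFTPFile for each entry afterwards.
static List<String[]> loadFeeds(Connection db) throws SQLException {
    List<String[]> feeds = new ArrayList<String[]>();
    Statement stmt = db.createStatement();
    ResultSet rs = stmt.executeQuery("SELECT host, username, password, filename FROM feeds");
    while (rs.next()) {
        feeds.add(new String[] {
            rs.getString("host"), rs.getString("username"),
            rs.getString("password"), rs.getString("filename")
        });
    }
    rs.close();
    stmt.close();
    return feeds;
}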