implementing JTimer in existing code - java

How can I implement a 60 second timeout in this code?
This code is opening a URL, downloading plain text, and sending output as a string variable.
Works, but sometimes hangs, and I have to start all over again.
I was hoping something which would timeout after 60 seconds and return whatever data is retrieved.
Please don't suggest using external libs like Apache etc. If I could edit this code itself, that would be better.
public static String readURL(URL url)
{
    try
    {
        // open the URL stream and wrap it in a few "readers"
        BufferedReader reader = new BufferedReader(new InputStreamReader(url.openStream()));
        String s = "";
        String line = "";
        while ((line = reader.readLine()) != null)
        {
            s = s + "\r\n" + line;
        }
        reader.close();
        return s;
    }
    catch (Exception e)
    {
        StringWriter errors = new StringWriter();
        e.printStackTrace(new PrintWriter(errors));
        return errors.toString();
    }
} // end method

Thread.sleep(60000);
The code above will only make the current thread sleep for 60 seconds and do nothing during that time; it will not time out the read.
If you want to change the timeout of your connection, look at Is it possible to read from a InputStream with a timeout?
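Since you said you'd rather edit this code itself, here is a plain-JDK sketch of one way to do it: open the connection through URLConnection and set connect/read timeouts, then catch SocketTimeoutException so that whatever was downloaded before the timeout is still returned. The wrapping class name `TimedReader` is just for illustration. Note that setReadTimeout() bounds each individual read, not the total wall-clock time, so a slow-but-steady server can still take longer than 60 seconds overall.

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.io.PrintWriter;
import java.io.StringWriter;
import java.net.SocketTimeoutException;
import java.net.URL;
import java.net.URLConnection;

public class TimedReader {
    public static String readURL(URL url) {
        StringBuilder s = new StringBuilder();
        try {
            URLConnection conn = url.openConnection();
            conn.setConnectTimeout(60_000); // give up if connecting blocks > 60 s
            conn.setReadTimeout(60_000);    // give up if a single read blocks > 60 s
            BufferedReader reader = new BufferedReader(
                    new InputStreamReader(conn.getInputStream()));
            String line;
            while ((line = reader.readLine()) != null) {
                s.append("\r\n").append(line);
            }
            reader.close();
        } catch (SocketTimeoutException te) {
            // timed out: fall through and return whatever was read so far
        } catch (Exception e) {
            StringWriter errors = new StringWriter();
            e.printStackTrace(new PrintWriter(errors));
            return errors.toString();
        }
        return s.toString();
    }
}
```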

Related

How to fix "may fail to close stream" error

I am writing a method but see this error: may fail to close stream.
According to some solutions on different posts, I have added try and catch within the finally block. I also added IOUtils.closeQuietly(fullObject, (Log) LOGGER). But it still doesn't work. Anyone can help take a look? Thanks!
S3Object fullObject = null;
StringBuffer buffer = new StringBuffer();
try {
    S3Object s3Response = s3Client.getObject(s3BucketName, s3Key);
    BufferedReader reader = new BufferedReader(new InputStreamReader(s3Response.getObjectContent()));
    String line;
    while ((line = reader.readLine()) != null) {
        buffer.append(line);
    }
} finally {
    if (fullObject != null) {
        try {
            fullObject.close();
        } catch (IOException ex) {
            throw new RuntimeException(ex);
        }
        IOUtils.closeQuietly(fullObject, (Log) LOGGER);
    }
}
return buffer.toString();
}
You should be using the Java 7+ try-with-resources statement. It will take care of closing the resources you declare in the resource list. Any exceptions thrown while closing are dealt with appropriately: they are either allowed to propagate, or they are "suppressed" if an exception was already propagating.
Your code using try-with-resources would look like this. It is half the length of the original version AND it won't have any resource leaks. You "win" both ways.
try (S3Object s3Response = s3Client.getObject(s3BucketName, s3Key);
     BufferedReader reader = new BufferedReader(
             new InputStreamReader(s3Response.getObjectContent()))) {
    StringBuffer buffer = new StringBuffer();
    String line;
    while ((line = reader.readLine()) != null) {
        buffer.append(line);
    }
    return buffer.toString();
}
Notice that I have gotten rid of fullObject which your code wasn't using.
There are actually two managed resources in the above: the s3Response and the reader. It might not be strictly necessary to close both, but (IMO) closing them anyway is the correct thing to do ... from the perspective of readability, if nothing else.
(It may also be possible to do the "read content as a string" more simply and/or more efficiently, but that is outside of the scope of this question.)
InputStreamReader implements AutoCloseable. This means that the intended use is try-with-resources:
try (InputStreamReader reader = new InputStreamReader(s3Response.getObjectContent())) {
    ...
}
This always closes the stream irrespective of how the block exits (i.e. through normal completion, a return, or an exception).
The same is true for S3Object and BufferedReader. They can all be declared as resources within the same try block.
See https://docs.oracle.com/javase/tutorial/essential/exceptions/tryResourceClose.html for more details.

How to use BufferedReader correctly with sockets

The application is a basic chat client.
I have a thread for receiving data from the server.
I want to get every response from the server separately, but it prints to the console only when the loop breaks (when I send "exit" using the other parts of the application).
So when System.out.println finally runs, it prints the whole buffer at once.
How can I make it work and print every response separately?
Thank you!
EDIT!!
If the server response includes "\n" after each line, it works for me.
Without "\n" it just waits until the loop breaks.
Is there a better way to do this without the "\n" issue?
class ServerThread implements Runnable {
    @Override
    public void run() {
        BufferedReader in = null;
        Socket socket;
        try {
            if (Thread.currentThread().isAlive()) {
                sendString("exit");
                Thread.currentThread().interrupt();
            }
            InetAddress serverAddress = InetAddress.getByName(SERVER_IP);
            socket = new Socket(serverAddress, SERVER_PORT);
            outr = new PrintWriter(new BufferedWriter(new OutputStreamWriter(socket.getOutputStream())), true);
            in = new BufferedReader(new InputStreamReader(socket.getInputStream()));
            String serverResponse;
            while ((serverResponse = in.readLine()) != null) {
                System.out.println(serverResponse);
            }
            in.close();
            outr.close();
            socket.close();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}
You're using a BufferedReader and read it with in.readLine(), which, surprise surprise, will return the next line in the response. A line is a string terminated by a newline character, so your BufferedReader will have to wait until it sees a newline until it can return your line. If you don't want to use newlines in your response, don't use readLine(), use one of the other read() methods instead.
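A sketch of that read()-based alternative (the `socket` variable is assumed from the question's code, and the helper class name is made up): read whatever characters have arrived into a buffer and print them immediately, without waiting for a newline.

```java
import java.io.InputStreamReader;
import java.io.Reader;
import java.net.Socket;

class RawSocketReader {
    // Prints chunks of characters as they arrive, instead of waiting for '\n'.
    // read() blocks until at least one char is available, and returns -1 at EOF.
    static void pump(Socket socket) throws Exception {
        Reader in = new InputStreamReader(socket.getInputStream());
        char[] buf = new char[1024];
        int n;
        while ((n = in.read(buf)) != -1) {
            System.out.print(new String(buf, 0, n));
            System.out.flush();
        }
    }
}
```

The trade-off is that TCP gives you a byte stream, not messages: a single read() may return half a server response or two responses glued together, which is exactly why protocols usually keep an explicit delimiter such as "\n".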

Building Java server and I can't get my page to stop loading (using PrintWriter and Buffered Reader)

I'm building a Java server and everything has been working as expected until now. I can serve up a static html page using two methods I wrote: body and header. Now, I am trying to write a new method called "bodywithQueryString".
Problem:
It almost works, but after the page is loaded, the loading won't stop. It just loads and loads. This is not happening with my static pages.
The only difference between the old method and new bodyWithQueryString() method is that in the new method I am using a buffered reader and print writer. These are new-ish functions for me so I'm guessing I'm not doing it right.
Here's how my new method is supposed to function:
I want to pass my route and querystring (queryarray) to bodyWithQueryString method. I want the method to read the file (from the route) to a byte output stream, do a replaceall on the key/value pair of the querystring while reading and, lastly, return the bytes. The getResponse() main method would then send the html to the browser.
Here's my code:
public void getResponse() throws Exception {
    String[] routeParts = parseRoute(route); // break apart route and querystring
    File theFile = new File(routeParts[0]);
    if (theFile.canRead()) {
        out.write(header(twoHundredStatusCode, routeParts[0], contentType(routeParts[0])));
        if (routeParts.length > 1) { // there must be a querystring
            String[] queryStringArray = parseQueryString(routeParts[1]); // break apart querystring
            out.write(bodyWithQueryString(routeParts[0], queryStringArray)); // use new body method
        }
        else out.write(body(routeParts[0])); // use original body method
        out.flush();
    }
}

private byte[] bodyWithQueryString(String route, String[] queryArray)
        throws Exception {
    BufferedReader reader = new BufferedReader(new FileReader(route));
    ByteArrayOutputStream fileOut = new ByteArrayOutputStream();
    PrintWriter writer = new PrintWriter(fileOut);
    String line;
    while ((line = reader.readLine()) != null) {
        writer.println(line.replaceAll(queryArray[0], queryArray[1]));
    }
    writer.flush();
    writer.close();
    reader.close();
    return fileOut.toByteArray();
}
It seems to me that you are not returning a Content-Length header. Without it, the browser cannot tell when the response ends, so it keeps loading.
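A sketch of the fix (the `header` method and `out` stream in the question are not shown, so this standalone helper is an illustration, not the asker's actual API): build the body bytes first, then put their exact length in a Content-Length header before writing the body.

```java
import java.io.OutputStream;
import java.nio.charset.StandardCharsets;

class HttpResponseWriter {
    // Writes a minimal HTTP response whose Content-Length matches the body,
    // so the browser knows exactly when the response is complete.
    static void writeResponse(OutputStream out, byte[] body, String contentType) throws Exception {
        String header = "HTTP/1.1 200 OK\r\n"
                + "Content-Type: " + contentType + "\r\n"
                + "Content-Length: " + body.length + "\r\n"
                + "\r\n";
        out.write(header.getBytes(StandardCharsets.US_ASCII));
        out.write(body);
        out.flush();
    }
}
```

The key point is that the length must be the byte count of the body actually sent; since bodyWithQueryString() already returns a byte[], its .length is exactly the value to use.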

extract specific part of html code

I am doing my first Android app and I have to take the code of a html page.
Actually I am doing this:
private class NetworkOperation extends AsyncTask<Void, Void, String> {
    protected String doInBackground(Void... params) {
        try {
            URL oracle = new URL("http://www.nationalleague.ch/NL/fr/");
            URLConnection yc = oracle.openConnection();
            BufferedReader in = new BufferedReader(new InputStreamReader(yc.getInputStream()));
            String inputLine;
            String s1 = "";
            while ((inputLine = in.readLine()) != null)
                s1 = s1 + inputLine;
            in.close();
            // return
            return s1;
        }
        catch (IOException e) {
            e.printStackTrace();
        }
        return null;
    }
}
but the problem is that it takes too much time. How can I take, for example, the HTML from line 200 to line 300?
Sorry for my bad English :$
Best case: instead of readLine(), use read(char[] cbuf, int off, int len). Another, dirtier way is to count lines as you read (note the condition must be an AND, otherwise it matches every line):
int i = 0;
while ((inputLine = in.readLine()) != null) {
    i++;
    if (i >= 200 && i <= 300) {
        // DO SOMETHING with inputLine
    }
}
in.close();
You get the HTML document through HTTP. HTTP usually relies on TCP. So... you can't just "skip lines"! The server will always try to send you all data preceding the portion of your interest, and your side of communication must acknowledge the reception of such data.
Do not read line by line [use read(char[] cbuf, int off, int len)]
Do not concat Strings [use a StringBuilder]
Open The buffered reader (much like you already do):
URL oracle = new URL("http://www.nationalleague.ch/NL/fr/");
BufferedReader in = new BufferedReader(new InputStreamReader(oracle.openStream()));
Instead of reading line by line, read into a char[] (I would use one of size about 8192)
and then use a StringBuilder to append all the read chars.
Reading specific lines of the HTML source seems a little risky because the formatting of the page's source code may change.
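Putting those two points together, a sketch of the chunked approach (wrapped in a made-up helper class; same URL as in the question): read 8192-char blocks into a StringBuilder instead of concatenating Strings line by line, which avoids the quadratic cost of repeated String concatenation.

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.io.Reader;
import java.net.URL;

class PageFetcher {
    // Reads the whole document in 8192-char chunks into a StringBuilder.
    static String fetch(URL url) throws Exception {
        try (Reader in = new BufferedReader(new InputStreamReader(url.openStream()))) {
            StringBuilder sb = new StringBuilder();
            char[] buf = new char[8192];
            int n;
            while ((n = in.read(buf, 0, buf.length)) != -1) {
                sb.append(buf, 0, n);
            }
            return sb.toString();
        }
    }
}
```

Usage would be `String html = PageFetcher.fetch(new URL("http://www.nationalleague.ch/NL/fr/"));` inside doInBackground(). This still downloads the whole page (as explained above, HTTP over TCP cannot skip ahead), but it does so much faster than line-by-line String concatenation.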

Java - scanner doesn't pause when no input

I am running a client and servlet under jetty locally. When I read the message in the client, I do:
in = new Scanner(conn.getInputStream());
StringBuffer messageBuffer = new StringBuffer();
while (in.hasNext()) {
    messageBuffer.append(in.next()).append(" ");
}
and I expect that when there is no data coming from the servlet, it should freeze at
while (in.hasNext())
Instead, I just end up with an empty messageBuffer, and I have to deal with it and call the method again and again until I get a message. Why is this happening? How can I make it stop at the while statement and wait until there is data coming in?
Here is how the url connection is started(once, in client constructor):
try {
url = new URL("http://localhost:8182/stream");
conn = (HttpURLConnection) url.openConnection();
} catch (MalformedURLException e) {
e.printStackTrace();
} catch (IOException ioE) {
ioE.printStackTrace();
}
from the Scanner doc:
The next() and hasNext() methods and their primitive-type companion
methods (such as nextInt() and hasNextInt()) first skip any input that
matches the delimiter pattern, and then attempt to return the next
token. Both hasNext and next methods may block waiting for further
input. Whether a hasNext method blocks has no connection to whether or
not its associated next method will block.
It says that it might block, but that is not guaranteed by the API; it depends on the underlying implementation and on what you are scanning.
In any case, you would need to implement the wait yourself with something like (where stop and sleep() are placeholders for your own flag and delay):
while (!in.hasNext() && !stop) {
    sleep();
}
The javadoc states that Scanner.hasNext() and Scanner.next() may block. It really depends on the underlying InputStream. I personally wouldn't ever use a Scanner to read from a socket if that's what that is.
A more sane approach is probably to use an InputStreamReader wrapped by a BufferedReader. Also worth mentioning: you should be using StringBuilder rather than StringBuffer unless you need thread safety.
BufferedReader br =
new BufferedReader(new InputStreamReader(conn.getInputStream()));
StringBuilder sb = new StringBuilder();
String input = null;
while ((input = br.readLine()) != null)
{
sb.append(input).append(" ");
}
Note this is using readLine() which may or may not suit your needs depending on what you're receiving. It also assumes the other end of the connection is going to close when it's done sending. You may want to use one of the read() methods instead and parse accordingly.
Edit to add from comments below: This is literally how blocking reads work in Java. Here's a complete example:
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.MalformedURLException;
import java.net.URL;
public class App
{
    public static void main( String[] args ) throws MalformedURLException, IOException
    {
        URL url = new URL("http://www.google.com");
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        BufferedReader br =
            new BufferedReader(new InputStreamReader(conn.getInputStream()));
        StringBuilder sb = new StringBuilder();
        String input = null;
        while ((input = br.readLine()) != null)
        {
            sb.append(input).append(" ");
        }
        System.out.println(sb.toString());
    }
}
Output (cut off for brevity here, but it's the entire page):
<!doctype html><html itemscope="itemscope" itemtype="http://schema.org/WebPage"><head> ...
In my experience, I have always tried to avoid blocking and therefore have avoided using Scanner methods for hasNext().
Instead, I have used the InputStream, which you already obtain in your code:
conn.getInputStream();
You can then use the InputStream method available() to see if any data has arrived. (Note that available() is a method on InputStream, not on Scanner, so keep a reference to the stream.)
So for your code, I would recommend:
InputStream inStream = conn.getInputStream();
in = new Scanner(inStream);
StringBuffer messageBuffer = new StringBuffer();
while (inStream.available() <= 0) {
    try {
        Thread.sleep(100);
    } catch (InterruptedException e) {}
}
messageBuffer.append(in.next()).append(" ");
Edited:
Try using a for loop instead of a while loop, where the range will be from 0 to (length - 1). You will have to find the length using while (in.hasNextLine()) and put a length counter inside it. Don't forget to close the scanner after you are done filling the messageBuffer (using .close()). Just try it out; it should work.
