Establish HTTP request with timeout - Java

I'm looking for a way to make an HTTP request from Java to check whether a server is alive.
For example, I want to scan the IP address range 192.168.1.1-255 and log which servers are alive.
I want to set a timeout of 3 seconds in case the HTTP response is delayed for some reason.
I have tried to do it this way:
try {
    Socket s = new Socket(InetAddress.getByName("192.168.1.2"), 80);
    s.setSoTimeout(3 * 1000);
    PrintWriter pw = new PrintWriter(s.getOutputStream());
    pw.println("GET / HTTP/1.1");
    pw.println("Host: stackoverflow.com");
    pw.println("");
    pw.flush();
    BufferedReader br = new BufferedReader(new InputStreamReader(s.getInputStream()));
    String t;
    while ((t = br.readLine()) != null) System.out.println(t);
    br.close();
} catch (SocketTimeoutException e) {
    System.out.println("Server is dead.");
} catch (ConnectException e) {
    System.out.println("Server is dead.");
}
But it doesn't seem to wait at all when the request takes longer than 3000 ms.
Thanks!

I think you confused the different timeouts. If you want to abort the connection attempt after three seconds without any response, then you should establish the connection as follows:
Socket clientSocket = new Socket();
clientSocket.connect(new InetSocketAddress(target, 80), 3 * 1000);
where target is any IP address. The following line essentially sets the timeout for reading from/waiting on the input stream - after the connection has been established. So it has no effect on establishing the connection itself. However, once the connection is established, it will interrupt the "read input stream" step after three seconds (by throwing an exception).
clientSocket.setSoTimeout(3 * 1000);
However, if you also want to limit the time spent reading the input stream without an exception being thrown, then you need a custom solution:
Is it possible to read from a InputStream with a timeout?
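One possible shape of such a custom solution (just a sketch of the idea, not the code from the linked answer) is to hand the blocking read to a worker thread and bound the wait on its result, so the timeout is reported via the Future rather than by the stream itself:
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.io.PrintWriter;
import java.net.InetSocketAddress;
import java.net.Socket;
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.TimeoutException;

public class BoundedRead {
    public static void main(String[] args) throws Exception {
        ExecutorService pool = Executors.newSingleThreadExecutor();
        // try-with-resources: closing the socket at the end also unblocks a worker
        // thread that is still stuck in readLine()
        try (Socket s = new Socket()) {
            // the connection attempt itself is limited to three seconds
            s.connect(new InetSocketAddress("192.168.1.2", 80), 3 * 1000);
            PrintWriter out = new PrintWriter(s.getOutputStream(), true);
            out.println("GET / HTTP/1.1");
            out.println("Host: 192.168.1.2");
            out.println("");
            BufferedReader br = new BufferedReader(new InputStreamReader(s.getInputStream()));
            // the blocking read runs on the worker thread; we wait at most three seconds for it
            Callable<String> readTask = br::readLine;
            Future<String> firstLine = pool.submit(readTask);
            try {
                System.out.println("First response line: " + firstLine.get(3, TimeUnit.SECONDS));
            } catch (TimeoutException e) {
                firstLine.cancel(true);
                System.out.println("No response within three seconds");
            }
        } finally {
            pool.shutdownNow();
        }
    }
}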
The following running example worked very well in my local network. It tried to connect for at most three seconds and detected all running webservers.
public class Main {
    public static void main(String[] args) {
        String net = "192.168.168."; // this is my local network
        for (int i = 1; i < 255; i++) { // we scan the range 1-255
            String target = net + i;
            System.out.println("Try to connect to: " + target);
            try {
                Socket clientSocket = new Socket();
                // we try to establish a connection, timeout is three seconds
                clientSocket.connect(new InetSocketAddress(target, 80), 3 * 1000);
                // talk to the server
                PrintWriter out = new PrintWriter(clientSocket.getOutputStream());
                out.println("GET / HTTP/1.1");
                out.println("Host: stackoverflow.com");
                out.println("");
                out.flush();
                BufferedReader br = new BufferedReader(new InputStreamReader(clientSocket.getInputStream()));
                String t;
                while ((t = br.readLine()) != null) System.out.println(t); // We print the answer of the server
                br.close();
                clientSocket.close();
                // server seems to be alive
                System.out.println("> Server is alive");
            } catch (SocketTimeoutException | ConnectException e) {
                System.out.println("> Server is dead");
            } catch (Exception e) { // This is not nice but this is also just a demo
                e.printStackTrace();
            }
        }
    }
}
Output (excerpt):
Try to connect to: 192.168.168.1
> Server is dead
Try to connect to: 192.168.168.2
> Server is dead
...
Try to connect to: 192.168.168.23
(answer of the server)
> Server is alive
...

Related

first message over socket lost in Eclipse

I have this funny behavior in debug mode which I can't explain to myself.
There are two small client and server applications, and the flow goes like this:
Client initiates TCP connection
Server accepts and sends a message to the client requesting the packet data size (it's fixed)
Client sends the size
Server reads the size and initializes a byte array - this works
Server blocks on read() from the input stream waiting for packets, and the client sends packets - this is the culprit
It all works when I run the applications; however, when I debug the server, it simply blocks on read().
If I send more than one message, for example 50, the server receives 49 of them. Once the client closes the connection, the server reads -1 from the stream and exits, with the first message lost.
The only thing I can think of is that in debug mode the client sends the message before the server reads it from the stream, but I don't see how that could be relevant, as the message should still be there.
Could somebody explain this behavior I am experiencing in debug mode only?
Note: client code is there just for debug, so it's not the prettiest.
Server
private static int sent;

public void processClientInput(HealthCheckSession session) {
    byte[] buffer = session.getBuffer();
    DataInputStream stream = session.getInFromClient();
    int readBytes = 0;
    int totalBytes = 0;
    try {
        while ((readBytes = stream.read(buffer)) != -1) { // <---- this blocks in debug on first message
            totalBytes += readBytes;
            if (totalBytes < buffer.length) {
                continue;
            } else {
                totalBytes = 0;
                String packet = new String(buffer, ProtocolValidator.SERIALIZATION_CHARSET);
                sendResponse(packet);
            }
        }
    } catch (IOException e) {
        e.printStackTrace();
    }
}

private void sendResponse(String packet) {
    System.out.println(packet);
    ++sent;
    System.out.println("sent " + sent);
}
Client
public static void main(String[] args) throws UnknownHostException,
        IOException, InterruptedException {
    Socket clientSocket = new Socket("localhost", 10002);
    // get the socket's output stream and open a PrintWriter on it
    PrintWriter outToServer = new PrintWriter(
            clientSocket.getOutputStream(), true);
    // get the socket's input stream and open a BufferedReader on it
    BufferedReader inFromServer = new BufferedReader(new InputStreamReader(
            clientSocket.getInputStream()));
    System.out.println("before read");
    System.out.println(inFromServer.readLine());
    System.out.println("after read");
    byte[] packageData = "100, ABCDZEGHR".getBytes(Charset.forName("UTF-8"));
    String size = String.valueOf(packageData.length);
    outToServer.println(size);
    for (int i = 0; i < 50; i++) {
        DataOutputStream dis = new DataOutputStream(
                clientSocket.getOutputStream());
        System.out.println("before write");
        dis.write(packageData);
        dis.flush();
        System.out.println("after write");
        Thread.sleep(1000);
    }
    try {
        Thread.sleep(100 * 1000);
    } catch (InterruptedException e) {
        // TODO Auto-generated catch block
        e.printStackTrace();
    }
    clientSocket.close();
}
IMPORTANT UPDATE
This does not happen in NetBeans debug mode.

Read multiple lines using BufferedReader (Socket)

I have already read some threads here on Stack Overflow, as well as some tutorials, but I can't find a solution to my problem.
I have a Java client which connects to a server, sends exactly one line to the server, and gets 2 or 3 lines as a response.
Here is my code:
public static void main(String[] args) {
    String message;
    String response;
    try {
        BufferedReader inFromUser = new BufferedReader(new InputStreamReader(System.in));
        Socket clientSocket = new Socket(hostname, port);
        DataOutputStream outToServer = new DataOutputStream(clientSocket.getOutputStream());
        BufferedReader inFromServer = new BufferedReader(new InputStreamReader(clientSocket.getInputStream()));
        message = inFromUser.readLine();
        outToServer.writeBytes(message + '\n');
        // here my program "freezes"
        while ((response = inFromServer.readLine()) != null) {
            System.out.println("response: " + response);
        }
        clientSocket.close();
    } catch (UnknownHostException e) {
        System.out.println("Unknown Host");
    } catch (IOException e) {
        System.out.println("IO Exception");
    }
}
My problem is that I can read every line of the response, but my program won't exit. The line clientSocket.close(); never gets called. What am I doing wrong?
Presumably your server isn't closing the connection - therefore the underlying stream for the reader isn't closed... at any point the server could send more information. readLine() only returns null when the stream has been closed, i.e. there will definitely not be any more data.
Now we don't know anything about the protocol here, but if the expected behaviour is that the client won't send any more information, and the server will close the connection, then the bug is in the server. If the protocol states that the server will keep the connection open, then the bug is in your client code and you need to work out how to detect the end of data (or send some sort of ack that will cause the server to close the connection, or whatever).
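For illustration only, here is a sketch of the second case. I assume (purely as an example - your protocol may define something else entirely) that the server marks the end of its reply with an empty line, and I also set a read timeout as a safety net; hostname, port and message are the same as in the question, and the surrounding try/catch is as in your original code:
Socket clientSocket = new Socket(hostname, port);
clientSocket.setSoTimeout(5000); // safety net: don't block forever if the server goes quiet
DataOutputStream outToServer = new DataOutputStream(clientSocket.getOutputStream());
BufferedReader inFromServer = new BufferedReader(new InputStreamReader(clientSocket.getInputStream()));
outToServer.writeBytes(message + '\n');
String response;
try {
    while ((response = inFromServer.readLine()) != null) {
        if (response.isEmpty()) {
            break; // end-of-response marker seen, stop without waiting for the server to close
        }
        System.out.println("response: " + response);
    }
} catch (SocketTimeoutException e) {
    System.out.println("No more data within 5 seconds, giving up");
}
clientSocket.close();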

Prevent server from terminating in a client (golang) / server (Java) application

I have a simple echo-server in Java:
int portNumber = 4444;
try (
    ServerSocket serverSocket =
        new ServerSocket(Integer.parseInt(args[0]));
    Socket clientSocket = serverSocket.accept();
    PrintWriter out =
        new PrintWriter(clientSocket.getOutputStream(), true);
    BufferedReader in = new BufferedReader(
        new InputStreamReader(clientSocket.getInputStream()));
) {
    String inputLine;
    while ((inputLine = in.readLine()) != null) {
        out.println(inputLine);
        System.out.println(inputLine);
    }
} catch (IOException e) {
    System.out.println("Exception caught when trying to listen on port "
        + portNumber + " or listening for a connection");
    System.out.println(e.getMessage());
}
and a simple golang client:
func main() {
    fmt.Println("start client")
    conn, err := net.Dial("tcp", "localhost:4444")
    if err != nil {
        log.Fatal("Connection error", err)
    }
    conn.Write([]byte("hello world"))
    conn.Close()
    fmt.Println("done")
}
When I start the server and then run the client, the server echoes "hello world" as expected, but then the server exits/terminates.
Q. How do I prevent this Java termination and force the server to continually wait for more client requests?
When the client terminates, the readLine on the server side reaches the end of the stream. So if you want the server to continuously listen for new connections, simply put the above server code in an endless loop,
e.g.
while (true) {
    // above code
}
For a play application that would be adequate.
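For completeness, here is a minimal sketch of that structure. In this variant I keep the ServerSocket outside the loop so the port is only bound once, and I bind to portNumber directly instead of args[0] just to keep the sketch self-contained; each accept() still serves a single client at a time:
int portNumber = 4444;
try (ServerSocket serverSocket = new ServerSocket(portNumber)) {
    while (true) { // keep accepting new clients after the previous one disconnects
        try (Socket clientSocket = serverSocket.accept();
             PrintWriter out = new PrintWriter(clientSocket.getOutputStream(), true);
             BufferedReader in = new BufferedReader(
                     new InputStreamReader(clientSocket.getInputStream()))) {
            String inputLine;
            while ((inputLine = in.readLine()) != null) {
                out.println(inputLine);
                System.out.println(inputLine);
            }
        } catch (IOException e) {
            System.out.println("Client connection failed: " + e.getMessage());
        }
    }
} catch (IOException e) {
    System.out.println("Could not listen on port " + portNumber + ": " + e.getMessage());
}
Serving several clients at once would require handing each accepted socket to its own thread, but for a play application the sequential loop above is enough.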

TCP Server response timeout

I have been tasked with getting a simple TCP client to time out. The client works as expected; however, I cannot seem to get it to time out when it does not receive any input for 3 seconds or more.
I have a basic understanding of SO_TIMEOUT, but can't get it to work here.
Please help.
Here is my code:
TCPClient
private static final String host = "localhost";
private static final int serverPort = 22003;

public static void main(String[] args) throws Exception {
    try {
        System.out.println("You are connected to the TCPClient;" + "\n" + "Please enter a message:");
        BufferedReader inFromUser = new BufferedReader(new InputStreamReader(System.in));
        @SuppressWarnings("resource")
        Socket client = new Socket(host, serverPort);
        client.setSoTimeout(3000);
        while (true) {
            DataOutputStream outToServer = new DataOutputStream(client.getOutputStream());
            BufferedReader inFromServer = new BufferedReader(new InputStreamReader(client.getInputStream()));
            String input = inFromUser.readLine();
            outToServer.writeBytes(input + "\n");
            String modedInput = inFromServer.readLine();
            System.out.println("You Sent: " + modedInput);
            try {
                Thread.sleep(2000);
            } catch (InterruptedException e) {
                System.out.println("Slept-in");
                e.getStackTrace();
            }
        }
    } catch (SocketTimeoutException e) {
        System.out.println("Timed Out Waiting for a Response from the Server");
    }
}
setSoTimeout doesn't do what you think it does. From the Javadoc:
With this option set to a non-zero timeout, a read() call on the
InputStream associated with this Socket will block for only this
amount of time.
It's a timeout for reads from the socket, so a read() will give up (with a SocketTimeoutException) after 3 seconds if no data has arrived. It's not a timeout for socket inactivity - i.e. the socket won't disconnect after being idle for 3 seconds.
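If you want the timeout to be handled per request instead of aborting the whole loop, you can catch it around the read itself. A sketch based on the question's code (host, serverPort and the streams are as defined there):
String input = inFromUser.readLine();
outToServer.writeBytes(input + "\n");
try {
    // this read blocks for at most ~3 seconds because of client.setSoTimeout(3000)
    String modedInput = inFromServer.readLine();
    System.out.println("You Sent: " + modedInput);
} catch (SocketTimeoutException e) {
    // the socket is still connected; this particular request simply got no reply in time
    System.out.println("Timed Out Waiting for a Response from the Server");
}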

Timeout on a java.util.Scanner?

I have the following problem. I have a simple TCP class in my application that sends a message to a device as a query, and the device responds with a message. However, there is no end-of-line character of any kind because the response comes through a serial converter. After initially attempting to use readLine and discovering that it requires an EOL character before returning, I tried the Scanner approach, which works fine unless the device doesn't reply to the request for some reason; my application then freezes. Is it possible to set a timeout on the Scanner so that it drops the connection and moves on, or is there a better way to do this? My code is below:
public String Send_TCP(InetAddress IPAddress, int POrt, String InData) throws IOException {
    Socket socket = null;
    PrintWriter out = null;
    BufferedReader in = null;
    try {
        socket = new Socket(IPAddress, POrt);
        out = new PrintWriter(socket.getOutputStream(), true);
        in = new BufferedReader(new InputStreamReader(socket.getInputStream()));
    } catch (UnknownHostException e) {
        System.err.println("Don't know about host");
        System.exit(1);
    } catch (IOException e) {
        System.err.println("Couldn't get I/O for the connection");
        System.exit(1);
    }
    BufferedReader read = new BufferedReader(new InputStreamReader(System.in));
    System.out.print("Connected, Sending:" + InData);
    out.println(InData);
    System.out.println("Equals");
    String str1 = new Scanner(in).useDelimiter(">").next() + ">";
    System.out.println(str1);
    System.out.println("Equals");
    out.close();
    in.close();
    read.close();
    socket.close();
    return str1;
}
I'm not sure that I understand your question correctly, but you can set a timeout on the socket: socket.setSoTimeout(int timeout).
See the Javadoc for setSoTimeout.
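As an illustration only, here is how that could be applied to the Send_TCP method from the question (variable names are the ones from the question; java.util.NoSuchElementException would need to be imported). Note that Scanner swallows the underlying SocketTimeoutException and treats it as end of input, so if the device sends nothing at all, next() fails with a NoSuchElementException (the original exception is retrievable via Scanner.ioException()):
socket = new Socket(IPAddress, POrt);
socket.setSoTimeout(3000); // any blocking read on this socket gives up after three seconds
out = new PrintWriter(socket.getOutputStream(), true);
in = new BufferedReader(new InputStreamReader(socket.getInputStream()));
out.println(InData);
String str1;
try {
    str1 = new Scanner(in).useDelimiter(">").next() + ">";
} catch (NoSuchElementException e) {
    // the device did not reply within the timeout
    str1 = null; // or retry / close the connection, depending on what the caller expects
}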
I believe the following achieves what I need. It basically checks whether any data is ready to be read; if it isn't, it waits and checks again, which avoids the trap of the Scanner blocking forever if the message never arrives. If data is ready, it reads it.
try {
    BufferedReader rd = new BufferedReader(new InputStreamReader(sock.getInputStream()));
    int count = 1;
    do {
        if (rd.ready()) {
            System.out.println("Response Ready");
            str = new Scanner(rd).useDelimiter(">").next() + ">";
            break; // response read, stop polling
        }
        System.out.println("Response Not Ready " + count);
        Thread.sleep(10);
        count++;
    } while (count < 25); // give up after roughly 250 ms without data
} catch (IOException | InterruptedException e) {
    e.printStackTrace();
}
