I have created a web application and hosted it on a server. Now I want to create a Java program which will access (or "hit") the URL of my application in a loop, so that I can check how much load my web application can handle. Also, the program should be able to tell me when the URL was successfully accessed and when it wasn't.
I tried executing it without using a loop:
try {
    URL url = new URL("https://example.com/");
    BufferedReader in = new BufferedReader(
            new InputStreamReader(url.openStream()));
    String inputLine;
    while ((inputLine = in.readLine()) != null) {
        System.out.println(inputLine);
    }
    in.close();
} catch (Exception e) {
    System.out.println("e: " + e.toString());
}
But I got this error:
e: javax.net.ssl.SSLHandshakeException: java.security.cert.CertificateException: No subject alternative DNS name matching example.com found.
Use
javax.net.ssl.HttpsURLConnection
to connect to https:// URLs.
Something like this (note: resource closing, exception handling, etc. are left to you):
final URL url = new URL("https://example.com");
final HttpsURLConnection con = (HttpsURLConnection) url.openConnection();
final BufferedReader br = new BufferedReader(new InputStreamReader(con.getInputStream()));
String input;
while ((input = br.readLine()) != null) {
    System.out.println(input);
}
However, there are lots of tools available to load-test it (Apache JMeter, for example).
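If you still want a quick-and-dirty load loop in plain Java, as the question asks, a minimal sketch could look like the following (the URL, the request count, and the idea of treating HTTP 200 as success are assumptions on my part):

import java.net.HttpURLConnection;
import java.net.URL;

public class SimpleLoadLoop {
    public static void main(String[] args) throws Exception {
        URL url = new URL("https://example.com/"); // replace with your application's URL
        int requests = 100; // arbitrary number of hits for this sketch

        for (int i = 0; i < requests; i++) {
            try {
                HttpURLConnection con = (HttpURLConnection) url.openConnection();
                con.setConnectTimeout(5000);
                con.setReadTimeout(5000);
                int code = con.getResponseCode(); // actually performs the request
                System.out.println("Hit " + i + ": " + (code == 200 ? "OK" : "failed, HTTP " + code));
                con.disconnect();
            } catch (Exception e) {
                System.out.println("Hit " + i + ": failed, " + e);
            }
        }
    }
}

Each iteration prints whether the hit succeeded or failed, which covers the reporting part of the question.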
You are just getting a stream from the URL, but obviously this URL uses HTTPS, so the server's certificate ("public key") needs to be trusted by your client application; otherwise the client and server won't be able to communicate with each other.
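If the underlying problem really is that the client does not trust the server's certificate, one common approach is to import that certificate into a truststore and point the JVM at it. A rough sketch (paths and password are placeholders):

// Placeholders: point these at a truststore that actually contains the server's certificate,
// e.g. one created with:  keytool -importcert -alias myserver -file server.crt -keystore mytruststore.jks
System.setProperty("javax.net.ssl.trustStore", "/path/to/mytruststore.jks");
System.setProperty("javax.net.ssl.trustStorePassword", "changeit");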
Related
I am starting a Tomcat server locally for a web application, and it takes around 20 minutes to be up and running. I want to check, via Java, whether the web app is up and running and accepting requests. Any help?
My server is, say, at localhost:8001/myapp
Thanks in advance.
You can check it in many ways. For example, register a servlet with load-on-startup and have it log messages to a file along with the exact time.
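A minimal sketch of such a startup servlet, assuming the Servlet 3.0 javax.servlet API (the class name, URL pattern, and log message are made up):

import java.util.Date;
import javax.servlet.annotation.WebServlet;
import javax.servlet.http.HttpServlet;

// loadOnStartup = 1 makes the container initialize this servlet during deployment,
// so the log line below marks roughly when the app became available.
@WebServlet(urlPatterns = "/startup-probe", loadOnStartup = 1)
public class StartupProbeServlet extends HttpServlet {
    @Override
    public void init() {
        System.out.println("myapp initialized at " + new Date());
    }
}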
You can add something like localhost:8001/myapp/status to the app that would return information about the current status. Then you can just send an HTTP request from Java and check the response.
public String execute(String uri) throws Exception {
    URL url = new URL(uri);
    URLConnection connection = url.openConnection();
    connection.setReadTimeout(1000);

    BufferedReader in = new BufferedReader(
            new InputStreamReader(
                    connection.getInputStream()));
    String inputLine;
    StringBuffer outputLine = new StringBuffer();
    while ((inputLine = in.readLine()) != null) {
        outputLine.append(inputLine);
    }
    in.close();

    return outputLine.toString();
}
I guess I will call this method periodically to see whether I get a timeout exception or the raw HTML.
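A minimal polling sketch along those lines, assuming the execute(String) method above is made static and that a /status endpoint like the one suggested actually exists:

// Polls the (hypothetical) status URL every 30 seconds until the app responds.
// Assumes execute(String) from the answer above is declared static and in scope.
public static void main(String[] args) throws Exception {
    String statusUrl = "http://localhost:8001/myapp/status";
    while (true) {
        try {
            String body = execute(statusUrl);
            System.out.println("App is up, response: " + body);
            break; // stop polling once we get a successful response
        } catch (Exception e) {
            System.out.println("Not up yet: " + e);
        }
        Thread.sleep(30_000); // wait 30 seconds before the next attempt
    }
}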
I am trying to grab a site's source code using this code:
private static String getUrlSource(String urlString) throws IOException {
    // The parameter and the local URL object need different names, otherwise this won't compile.
    URL url = new URL(urlString);
    URLConnection urlConn = url.openConnection();
    BufferedReader in = new BufferedReader(new InputStreamReader(
            urlConn.getInputStream(), "UTF-8"));
    String inputLine;
    StringBuilder a = new StringBuilder();
    while ((inputLine = in.readLine()) != null) {
        a.append(inputLine);
    }
    in.close();
    return a.toString();
}
When I grab the site code this way, I get an error about needing to allow cookies. Is there any way to allow cookies in a Java application just so I can grab some source code? I do have the cookie my browser uses to log me in, if that helps.
Thanks
John
Done that way, you would have to deal with raw request data. Go with Apache HttpClient, which gives you an abstraction and methods that allow you to set headers on the request.
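A minimal sketch with Apache HttpClient 4.x, sending the cookie as a request header (the URL and the cookie value are placeholders; copy the real name=value pair from your browser):

import org.apache.http.client.methods.CloseableHttpResponse;
import org.apache.http.client.methods.HttpGet;
import org.apache.http.impl.client.CloseableHttpClient;
import org.apache.http.impl.client.HttpClients;
import org.apache.http.util.EntityUtils;

public class FetchWithCookie {
    public static void main(String[] args) throws Exception {
        try (CloseableHttpClient client = HttpClients.createDefault()) {
            HttpGet get = new HttpGet("http://example.com/page"); // placeholder URL
            // Placeholder cookie string; use the real name=value pair from your browser session.
            get.setHeader("Cookie", "SESSIONID=abc123");
            try (CloseableHttpResponse response = client.execute(get)) {
                String body = EntityUtils.toString(response.getEntity(), "UTF-8");
                System.out.println(body);
            }
        }
    }
}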
I am trying to use a signed Java applet to post to a URL like:
http://some.domain.com/something/script.asp?param=5041414F9015496EA699F3D2DBAB4AC2|178411|163843|557|1|1|164||attempt|1630315
But when Java makes the connection, the Java console shows:
network: Connecting http://some.domain.com/something/script.asp?param=5041414F9015496EA699F3D2DBAB4AC2%7C178411%7C163843%7C557%7C1%7C1%7C164%7C%7Cattempt%7C1630315
I do not want Java to URL-encode the pipes in the query from | to %7C. It seems the service I'm connecting to doesn't URL-decode the param, and I can't change the server-side code. Is there a way in Java to make the POST without escaping the query?
The Java I'm using is below:
try {
    URL url = new URL(myURL);
    URLConnection connection = url.openConnection();
    connection.setDoOutput(true);

    OutputStreamWriter out = new OutputStreamWriter(
            connection.getOutputStream());
    out.write(toSend);
    out.close();

    BufferedReader in = new BufferedReader(
            new InputStreamReader(
                    connection.getInputStream()));
    String decodedString;
    while ((decodedString = in.readLine()) != null) {
        totalResponse = totalResponse + decodedString;
    }
    in.close();
} catch (Exception ex) {
    // At least log the failure instead of silently swallowing it.
    ex.printStackTrace();
}
Thank you for any help!
The URL class does not do any encoding. Testing this on my dev server confirmed this suspicion. Your code must be encoding the '|' character somewhere before the snippet you included in your question.
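A quick way to convince yourself (the URL below is just a made-up example):

// java.net.URL leaves the query string exactly as given; it does no percent-encoding.
URL url = new URL("http://some.domain.com/something/script.asp?param=a|b|attempt|123");
System.out.println(url.getQuery()); // prints: param=a|b|attempt|123 (pipes untouched)
// By contrast, URLEncoder.encode("a|b", "UTF-8") is the kind of call that turns '|' into %7C.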
I am trying to read in a website and save it to a String. I'm using the code below, which works perfectly fine in Eclipse. But when I try to run the program via the command line in Windows, like "java MyProgram", the program starts, just hangs, and never reads in the URL. Anyone know why this would be happening?
URL link = new URL("http://www.yahoo.com");
BufferedReader in = new BufferedReader(new InputStreamReader(link.openStream()));
//InputStream in = link.openStream();
String inputLine = "";
int count = 0;
while ((inputLine = in.readLine()) != null) {
    site = site + "\n" + inputLine;
}
in.close();
...
It could be because you are behind a proxy, and Eclipse is automatically adding settings to configure this.
If you are behind a proxy, when running from the command prompt, try setting the java.net.useSystemProxies property. You can also manually configure proxy settings with a few networking properties (http.proxyHost, http.proxyPort).
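For example, before opening any connections (the host and port below are placeholders):

// Either pick up the OS/browser proxy configuration...
System.setProperty("java.net.useSystemProxies", "true");

// ...or configure the proxy explicitly.
System.setProperty("http.proxyHost", "proxy.mycompany.example");
System.setProperty("http.proxyPort", "8080");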
I encountered such a problem and found a solution.
Here is my working code:
// Create a URL for the desired page
URL url = new URL("your url");

// Get connection
HttpURLConnection connection = (HttpURLConnection) url.openConnection();
connection.setConnectTimeout(5000); // 5 seconds connect timeout
connection.setReadTimeout(5000);    // 5 seconds read (socket) timeout

// Connect
connection.connect(); // Without this line, readLine() got stuck for me

InputStreamReader isr = new InputStreamReader(url.openStream(), "UTF-8");
BufferedReader in = new BufferedReader(isr);
String str;
while ((str = in.readLine()) != null) {
    listItems.add(str);
}

// Close everything
in.close();
isr.close();
connection.disconnect();
If that's all your code is doing, there's no reason it shouldn't work from the command line. I suspect you've cut out what's broken. For example:
public static void main(String[] args) throws Exception {
    String site = "";
    URL link = new URL("http://www.yahoo.com");
    BufferedReader in = new BufferedReader(new InputStreamReader(link.openStream()));
    //InputStream in = link.openStream();
    String inputLine = "";
    int count = 0;
    while ((inputLine = in.readLine()) != null) {
        site = site + "\n" + inputLine;
    }
    in.close();
    System.out.println(site);
}
works fine. Another possibility would be if you're running it in Eclipse and from the command line on two different computers, and the latter can't reach http://www.yahoo.com.
I want to read data from a secure web page, say https://www.paypal.com, and I am behind a proxy. I tried with:
System.setProperty("java.net.useSystemProxies","true");
System.setProperty("htttps.proxyHost","myproxyhost");
System.setProperty("https.proxyPort","443");
URL u = new URL("https://www.paypal.com");
URLConnection uc = u.openConnection();
uc.setDoOutput(true);
StringBuffer sbuf=new StringBuffer();
BufferedReader in = new BufferedReader(
new InputStreamReader(uc.getInputStream()));
String res = in.readLine();
System.out.println(" Response from paypal "+res);
while ((res = in.readLine()) != null) {
    sbuf.append(res).append(",");
}
in.close();
System.out.println(" Total Data received "+sbuf);
I am getting UnknownHostException all the time, though I am successfully fetching data from http websites. Am I missing something?
Thanks,
Rohit
You've got 3 T's in your proxyHost setting, i.e. you are using htttps rather than https.
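With the property name corrected, the sketch would be (the host is still a placeholder, and double-check which port your proxy actually listens on):

// Corrected property name: "https", not "htttps".
System.setProperty("https.proxyHost", "myproxyhost");
System.setProperty("https.proxyPort", "443"); // use your proxy's real port here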