HTTP GET request in Java

I'm trying to issue an HTTP GET request from Java to my localhost on port 4567.
I have the GET request: "wget -q -O- http://localhost:4567/XXXX"
[XXXX is some parameter - not relevant].
I've found the java.net.URLConnection class for this kind of thing, but it seems the URLConnection object is supposed to receive the URL, port, and all the other parameters separately (in other words, you have to construct the object yourself). However, I already have the full HTTP GET request as written above. Is there a way to simply 'shoot' the request without constructing the fields for URLConnection?

You can create the URL object from your full URL string; it will figure out the port and everything else itself. Ref: https://docs.oracle.com/javase/tutorial/networking/urls/readingWriting.html
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.URL;
import java.net.URLConnection;

public class URLConnectionReader {
    public static void main(String[] args) throws Exception {
        URL oracle = new URL("http://localhost:4567/XXXX");
        URLConnection yc = oracle.openConnection();
        BufferedReader in = new BufferedReader(new InputStreamReader(
                yc.getInputStream()));
        String inputLine;
        while ((inputLine = in.readLine()) != null)
            System.out.println(inputLine);
        in.close();
    }
}

Why don't you use the Apache HttpClient library? It is simple to use.
HttpClient client = new HttpClient();
Refer to the documentation: http://hc.apache.org/httpclient-3.x/tutorial.html
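Since Java 11 the JDK also ships its own client in java.net.http, so the same GET can be done without any third-party dependency. A minimal sketch (the URI is the one from the question):

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class HttpClientGet {
    // Sends a GET request to the given URI and returns the response body.
    public static String get(String uri) throws Exception {
        HttpClient client = HttpClient.newHttpClient();
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create(uri))
                .GET()
                .build();
        HttpResponse<String> response =
                client.send(request, HttpResponse.BodyHandlers.ofString());
        return response.body();
    }
}
```

Calling `HttpClientGet.get("http://localhost:4567/XXXX")` then behaves much like the wget command from the question.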

Related

Java HTTP GET request slower than Postman GET request

I'm trying to send a GET request in order to get a website's content.
When I use Postman it takes about 70-100 ms, but when I use the following code:
String getUrl = "someUrl";
URL obj = new URL(getUrl);
HttpURLConnection con = (HttpURLConnection) obj.openConnection();
// optional, default is GET
con.setRequestMethod("GET");
// add request header
con.setRequestProperty("User-Agent", "Mozilla/5.0");
BufferedReader in = new BufferedReader(new InputStreamReader(con.getInputStream()));
String inputLine;
StringBuffer response = new StringBuffer();
while ((inputLine = in.readLine()) != null) {
    response.append(inputLine);
}
in.close();
response.toString();
it takes about 3-4 seconds.
Any idea how to get my code to work as fast as Postman?
Thanks.
Try to find a workaround for the while loop. Maybe that is your bottleneck. What are you even getting from your URL? A JSON object or something else?
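If the line-by-line loop is the suspect, one way to rule it out is to read the body in bulk. A sketch using InputStream.readAllBytes() (Java 9+); the User-Agent header is carried over from the question's code:

```java
import java.io.InputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;

public class BulkRead {
    // Reads the whole response body in one call instead of line by line.
    public static String fetch(String address) throws Exception {
        HttpURLConnection con = (HttpURLConnection) new URL(address).openConnection();
        con.setRequestMethod("GET");
        con.setRequestProperty("User-Agent", "Mozilla/5.0");
        try (InputStream in = con.getInputStream()) {
            return new String(in.readAllBytes(), StandardCharsets.UTF_8);
        } finally {
            con.disconnect();
        }
    }
}
```

Note that a single cold request also pays for DNS lookup and (for HTTPS) the TLS handshake, which a client like Postman may already have cached, so one-off comparisons can be misleading.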
Try http-request, built on the Apache HTTP API.
HttpRequest<String> httpRequest = HttpRequestBuilder.createGet(someUri, String.class)
        .responseDeserializer(ResponseDeserializer.ignorableDeserializer())
        .addDefaultHeader("User-Agent", "Mozilla/5.0")
        .build();

public void send() {
    String response = httpRequest.execute().get();
}
I highly recommend reading the documentation before use.

Check if my server is up via Java

I am starting a Tomcat server locally for a web application, and it takes around 20 minutes to be up and running. I want to check via Java whether the web app is up and accepting requests. Any help?
My server is at, say, localhost:8001/myapp
Thanks in advance.
You can check it in many ways. For example, set a servlet to load on startup and have it write log messages to a file along with exact timestamps.
You can add an endpoint like localhost:8001/myapp/status to the app that returns information about the current status. Then you can just send an HTTP request from Java and check the response:
public String execute(String uri) throws Exception {
    URL url = new URL(uri);
    URLConnection connection = url.openConnection();
    connection.setReadTimeout(1000);
    BufferedReader in = new BufferedReader(
            new InputStreamReader(
                    connection.getInputStream()));
    String inputLine;
    StringBuffer outputLine = new StringBuffer();
    while ((inputLine = in.readLine()) != null)
        outputLine.append(inputLine);
    in.close();
    return outputLine.toString();
}
I guess I will call this method after a certain time period to see whether I get a timeout exception or the raw HTML.
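That periodic check can be wrapped in a small helper that swallows the connect/read-timeout exceptions and just reports up or down; the status path and timeout values below are placeholders:

```java
import java.net.HttpURLConnection;
import java.net.URL;

public class HealthCheck {
    // Returns true if the URL answers with HTTP 200 within the timeout.
    public static boolean isUp(String uri, int timeoutMillis) {
        try {
            HttpURLConnection con = (HttpURLConnection) new URL(uri).openConnection();
            con.setConnectTimeout(timeoutMillis);
            con.setReadTimeout(timeoutMillis);
            con.setRequestMethod("GET");
            int code = con.getResponseCode();
            con.disconnect();
            return code == 200;
        } catch (Exception e) {
            // Connection refused or timed out: the server is not ready yet.
            return false;
        }
    }
}
```

A ScheduledExecutorService could then call isUp("http://localhost:8001/myapp/status", 1000) every few seconds until it returns true.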

How do I send a cookie while trying to grab a site's source?

I am trying to grab a site's source code using this code:
private static String getUrlSource(String address) throws IOException {
    URL url = new URL(address);
    URLConnection urlConn = url.openConnection();
    BufferedReader in = new BufferedReader(new InputStreamReader(
            urlConn.getInputStream(), "UTF-8"));
    String inputLine;
    StringBuilder a = new StringBuilder();
    while ((inputLine = in.readLine()) != null)
        a.append(inputLine);
    in.close();
    return a.toString();
}
When I grab the site's code this way, I get an error about needing to allow cookies. Is there any way to allow cookies in a Java application just so I can grab some source code? I do have the cookie my browser uses to log me in, if that helps.
Thanks
John
That way you would have to deal with raw request data. Go with the Apache HTTP client, which gives you abstraction and methods to set headers in the request.
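For completeness, a cookie can also be sent with plain URLConnection by setting it as a request header before the connection is opened. A sketch, where the cookie value is whatever your browser sends:

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.URL;
import java.net.URLConnection;

public class CookieFetch {
    // Fetches a page, sending the given cookie string with the request.
    public static String fetch(String address, String cookie) throws Exception {
        URLConnection conn = new URL(address).openConnection();
        // Must be set before getInputStream() opens the connection.
        conn.setRequestProperty("Cookie", cookie);
        StringBuilder sb = new StringBuilder();
        try (BufferedReader in = new BufferedReader(
                new InputStreamReader(conn.getInputStream(), "UTF-8"))) {
            String line;
            while ((line = in.readLine()) != null)
                sb.append(line);
        }
        return sb.toString();
    }
}
```

For anything beyond a single hard-coded cookie (expiry, multiple cookies, redirects), a real HTTP client library is still the better choice.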

Create RESTful Client Consumer with Tomcat

I am using Apache Tomcat 6.0.20.
I want to create a client to consume a RESTful web service (using GET).
I know I can do it the old-fashioned way with URLConnection (a regular GET request).
But I wonder, is there any way of doing it differently? Maybe with annotations?
I think this article, http://www.oracle.com/technetwork/articles/javase/index-137171.html, will give you good guidance on how to act in both directions.
I'm currently using the Spring API. Connection handling, for example, is already taken care of within the RestTemplate class. Have a look at http://static.springsource.org/spring/docs/3.0.x/spring-framework-reference/html/remoting.html#rest-client-access.
Using NetBeans 7, RESTful web services can be created with a simple wizard (using the Jersey API): http://netbeans.org/kb/docs/websvc/rest.html. This approach uses annotations.
In the end I chose to use the Java SE API in the old-fashioned way:
public void getRestfullMethod(...) throws IOException
{
    // Build the request data.
    StringBuffer buf = new StringBuffer(..);
    buf.append("&system=").append("someVal");
    String urlStr = buf.toString();
    // Send the request.
    URL url = new URL(urlStr);
    URLConnection con = url.openConnection();
    // Return the response.
    BufferedReader in = new BufferedReader(new InputStreamReader(con.getInputStream()));
    String inputLine = null;
    buf = new StringBuffer();
    while ((inputLine = in.readLine()) != null)
        buf.append(inputLine);
    in.close();
}

Using an API of a website in a servlet. Is this the right way?

public java.lang.StringBuffer getRequestURL()
I am using this method to call the API of another website, which returns XML data in response. Is this the right method to use with an HTTP request/response?
No. You should use new URL(url).openConnection(), or some abstraction like HttpComponents or a REST client.
If you want to make HTTP requests from within a servlet, you do it as you would from any process. Something like this:
public static void main(String[] args) throws Exception {
    URL url = new URL("http://www.targetdomain.com/api?key1=value1&key2=value2...");
    HttpURLConnection conn = (HttpURLConnection) url.openConnection();
    conn.setConnectTimeout(5000); // 5 seconds
    conn.setRequestMethod("GET");
    conn.connect();
    BufferedReader rd = new BufferedReader(new InputStreamReader(conn.getInputStream()));
    String line;
    StringBuffer bf = new StringBuffer();
    while ((line = rd.readLine()) != null) {
        bf.append(line);
    }
    conn.disconnect();
    // ... pass bf to an XML parser and do your processing ...
}
Depending on which XML parser you're using, you can probably skip buffering the response in a StringBuffer and instead pass your parser the response InputStream directly.
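With the JDK's built-in DOM parser, that streaming approach might look like this; the URL is a placeholder and error handling is elided:

```java
import java.io.InputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;

public class XmlFetch {
    // Parses the XML response straight from the connection's InputStream,
    // skipping the intermediate StringBuffer entirely.
    public static Document fetchXml(String address) throws Exception {
        HttpURLConnection conn = (HttpURLConnection) new URL(address).openConnection();
        conn.setConnectTimeout(5000);
        conn.setRequestMethod("GET");
        try (InputStream in = conn.getInputStream()) {
            return DocumentBuilderFactory.newInstance()
                    .newDocumentBuilder()
                    .parse(in);
        } finally {
            conn.disconnect();
        }
    }
}
```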
