How do I do an HTTP GET in Java?
If you want to fetch the contents of any web page, you can use the method below.
import java.io.*;
import java.net.*;

public class HtmlFetcher {

    public static String getHTML(String urlToRead) throws Exception {
        StringBuilder result = new StringBuilder();
        URL url = new URL(urlToRead);
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestMethod("GET");
        // Read the response body line by line and collect it in the StringBuilder.
        try (BufferedReader reader = new BufferedReader(
                new InputStreamReader(conn.getInputStream()))) {
            for (String line; (line = reader.readLine()) != null; ) {
                result.append(line);
            }
        }
        return result.toString();
    }

    public static void main(String[] args) throws Exception {
        System.out.println(getHTML(args[0]));
    }
}
Technically you could do it with a raw TCP socket, but I wouldn't recommend it. I would highly recommend you use Apache HttpClient instead. In its simplest form:
HttpClient client = new HttpClient();
GetMethod get = new GetMethod("http://httpcomponents.apache.org");
// Execute the method and handle any error responses.
int statusCode = client.executeMethod(get);
InputStream in = get.getResponseBodyAsStream();
// Process the data from the input stream.
get.releaseConnection();
and here is a more complete example.
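For reference, the raw-socket approach mentioned above would look roughly like this. It is only a minimal sketch, assuming a plain HTTP server on port 80; it ignores redirects, chunked transfer encoding, and TLS, which is exactly why a library is preferable:
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.io.PrintWriter;
import java.net.Socket;

public class RawSocketGet {
    public static void main(String[] args) throws Exception {
        // Open a plain TCP connection to the web server (port 80, no TLS).
        try (Socket socket = new Socket("example.com", 80);
             PrintWriter out = new PrintWriter(socket.getOutputStream(), true);
             BufferedReader in = new BufferedReader(
                     new InputStreamReader(socket.getInputStream()))) {

            // Write the request line and headers by hand.
            out.print("GET / HTTP/1.1\r\n");
            out.print("Host: example.com\r\n");
            out.print("Connection: close\r\n");
            out.print("\r\n");
            out.flush();

            // Dump the raw response (status line, headers and body) to stdout.
            String line;
            while ((line = in.readLine()) != null) {
                System.out.println(line);
            }
        }
    }
}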
If you don't want to use external libraries, you can use the URL and URLConnection classes from the standard Java API.
An example looks like this:
String urlString = "http://wherever.com/someAction?param1=value1&param2=value2....";
URL url = new URL(urlString);
URLConnection conn = url.openConnection();
InputStream is = conn.getInputStream();
// Do what you want with that stream
The simplest way that doesn't require third-party libraries is to create a URL object and then call either openConnection or openStream on it. Note that this is a pretty basic API, so you won't have a lot of control over the headers.
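For instance, openStream() gives you the response body directly; a minimal sketch (the URL is just a placeholder):
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.URL;

public class OpenStreamExample {
    public static void main(String[] args) throws Exception {
        // openStream() is shorthand for openConnection().getInputStream().
        try (BufferedReader reader = new BufferedReader(new InputStreamReader(
                new URL("http://wherever.com/someAction").openStream()))) {
            String line;
            while ((line = reader.readLine()) != null) {
                System.out.println(line);
            }
        }
    }
}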
Related
I'm trying to issue an HTTP GET request in Java to my localhost on port 4567.
I have the GET request: "wget -q -O- http://localhost:4567/XXXX"
[XXXX is some parameter - not relevant].
I've found the java.net.URLConnection class for this kind of thing, but it seems that the URLConnection object is supposed to receive the URL, port, and all the other parameters separately (in other words, you have to construct the object yourself). However, I have the full HTTP GET request as written above. Is there a way to simply 'shoot' the request without constructing the fields for URLConnection myself?
You can create the URL object from your full URL string; it will figure out the port and other details itself. Ref: https://docs.oracle.com/javase/tutorial/networking/urls/readingWriting.html
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.URL;
import java.net.URLConnection;

public class URLConnectionReader {
    public static void main(String[] args) throws Exception {
        URL oracle = new URL("http://localhost:4567/XXXX");
        URLConnection yc = oracle.openConnection();
        BufferedReader in = new BufferedReader(new InputStreamReader(
                yc.getInputStream()));
        String inputLine;
        while ((inputLine = in.readLine()) != null)
            System.out.println(inputLine);
        in.close();
    }
}
Why don't you use the Apache HttpClient library? It is simple to use:
HttpClient client = new HttpClient();
Refer to the documentation: http://hc.apache.org/httpclient-3.x/tutorial.html
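A complete GET with the 3.x API looks roughly like this. It is only a sketch, assuming Commons HttpClient 3.x is on the classpath; the URL is the one from the question:
import org.apache.commons.httpclient.HttpClient;
import org.apache.commons.httpclient.HttpStatus;
import org.apache.commons.httpclient.methods.GetMethod;

public class HttpClientGetExample {
    public static void main(String[] args) throws Exception {
        HttpClient client = new HttpClient();
        GetMethod method = new GetMethod("http://localhost:4567/XXXX");
        try {
            int statusCode = client.executeMethod(method);
            if (statusCode != HttpStatus.SC_OK) {
                System.err.println("Request failed: " + method.getStatusLine());
            }
            // Read the response body as a string (fine for small responses).
            System.out.println(method.getResponseBodyAsString());
        } finally {
            // Always release the connection back to the connection manager.
            method.releaseConnection();
        }
    }
}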
Hi, I have been trying to search for bugs in Bugzilla through its REST API. To get the bugs I wrote the Java code below, which is giving a 406 error.
public static void main(String[] args) throws IOException, JsonParser.ParseException,
        JSONException, ParseException {
    URL url = new URL("http:mybugzilla.com/bug");
    HttpURLConnection urlConnection = (HttpURLConnection) url.openConnection();
    urlConnection.setRequestProperty("Accept", "application/json");
    urlConnection.setRequestMethod("GET");
    urlConnection.setDoOutput(true);
    urlConnection.connect();
    PrintStream printStream = new PrintStream(urlConnection.getOutputStream());
    //printStream.print();
    BufferedReader br = new BufferedReader(new InputStreamReader(urlConnection.getInputStream()));
    String line;
    StringBuilder sb = new StringBuilder();
    while ((line = br.readLine()) != null) {
        sb.append(line).append("\n");
    }
    System.out.println(sb);
}
According to the API doc you are probably missing something in your request string.
At the very least your URL url=new URL("http:mybugzilla.com/bug") should be
URL url=new URL("http://mybugzilla.com/bug")
A sample example written in Python to get the list of public bugs from Bugzilla 5.x, using the REST API:
import requests
url_bz_restapi = 'http://localhost/bugzilla/rest.cgi/bug'
r = requests.get(url_bz_restapi)
Very strange problem; I don't have any idea. Maybe you can help again, as so often :)
I create a simple URLConnection and use the POST method. When I look at the traffic in Wireshark, everything is sent correctly. I try to store the response in a string, but that string is a shortened version of the entire packet, even though it ends correctly (with a closing /html tag).
A diff in Notepad shows where my string ends, while in Wireshark the response continues with:
<a href="wato.py?mode=edithost&host=ColorPrinter ... muchmuchmore ...
This is the place where it seems to get cut off.
Really strange stuff. Now this is my code:
public void uploadCsv(File csvFile) throws CsvImportException, IOException {
    String sUrl = String.format(urlBaseWato, hostAddress);
    String csvFileContent = readFile(csvFile);
    ParamContainer params = new ParamContainer().addParam("a", "a")
                                                .addParam("b", b)
                                                .addParam("c", "c");
    URLConnection connection = new URL(sUrl).openConnection();
    postData(connection, params);
    String resp = getResponse(connection); // <---- broken string here :(
    ...
}
-
private void postData(URLConnection con, ParamContainer params) throws IOException {
    int cLen = params.getEncodedParamString().getBytes().length;
    con.setDoInput(true);
    con.setDoOutput(true);
    con.setUseCaches(false);
    con.setRequestProperty("Content-Type", "application/x-www-form-urlencoded");
    con.setRequestProperty("Cookie", authCookie.toString());
    con.setRequestProperty("Content-Length", Integer.toString(cLen));
    // Send request
    DataOutputStream os = new DataOutputStream(con.getOutputStream());
    os.writeBytes(params.getEncodedParamString());
    os.flush();
    os.close();
}
-
private String getResponse(URLConnection connection) throws IOException {
    connection.connect();
    BufferedReader in = new BufferedReader(new InputStreamReader(
            connection.getInputStream()));
    String line;
    String response = "";
    while ((line = in.readLine()) != null)
        response += line;
    in.close();
    return response;
}
Mysterious, I don't have the slightest idea. Can you help me?
Does using IOUtils help?
URLConnection connection = new URL(sUrl).openConnection();
IOUtils.toString(connection.getInputStream(), "UTF-8");
or even:
IOUtils.toString(new URL(sUrl), "UTF-8");
Even if not, always consider it first to reduce the amount of boilerplate in your code.
I guess I should make this an answer in case it's the problem and can get accepted. I've seen cases where closing the socket cuts off the stream that's already been written there. Try putting a Thread.sleep(5000) just in front of the socket closure.
public java.lang.StringBuffer getRequestURL()
I am using this method to call the API of another website, which returns XML data in its response. Is this the right method to use for an HTTP request/response?
No. You should use new URL(url).openConnection(), or some abstraction like HttpComponents or a REST client.
If you want to make HTTP requests from within a Servlet you do it as you would from any process. Something like this:
public static void main(String[] args) throws Exception {
    URL url = new URL("http://www.targetdomain.com/api?key1=value1&key2=value2...");
    HttpURLConnection conn = (HttpURLConnection) url.openConnection();
    conn.setConnectTimeout(5000); // 5 seconds
    conn.setRequestMethod("GET");
    conn.connect();
    BufferedReader rd = new BufferedReader(new InputStreamReader(conn.getInputStream()));
    String line;
    StringBuffer bf = new StringBuffer();
    while ((line = rd.readLine()) != null) {
        bf.append(line);
    }
    conn.disconnect();
    //... pass bf to an XML parser and do your processing...
}
Depending on whatever XML parser you're using, you can probably skip buffering the response and putting it in a StringBuffer, and instead pass your parser the response InputStream directly.
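For example, with the built-in DOM parser you can hand it the response stream directly. This is a minimal sketch; the URL is the same placeholder as above and error handling is omitted:
import java.net.HttpURLConnection;
import java.net.URL;
import javax.xml.parsers.DocumentBuilder;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;

public class XmlGetExample {
    public static void main(String[] args) throws Exception {
        URL url = new URL("http://www.targetdomain.com/api?key1=value1&key2=value2...");
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestMethod("GET");

        // Feed the response stream straight to the XML parser; no intermediate StringBuffer.
        DocumentBuilder builder = DocumentBuilderFactory.newInstance().newDocumentBuilder();
        Document doc = builder.parse(conn.getInputStream());
        System.out.println("Root element: " + doc.getDocumentElement().getNodeName());

        conn.disconnect();
    }
}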
I want to download the HTML source code of a site to parse some info. How do I accomplish this in Java?
Just attach a BufferedReader (or anything that reads strings) to the URL's InputStream returned from openStream().
public static void main(String[] args) throws IOException {
    URL url = new URL("http://stackoverflow.com/");
    BufferedReader reader = new BufferedReader(new InputStreamReader(url.openStream()));
    String s = null;
    while ((s = reader.readLine()) != null)
        System.out.println(s);
}
You can use the Java classes directly:
URL url = new URL("http://www.example.com");
URLConnection conn = url.openConnection();
InputStream in = conn.getInputStream();
...
but it's generally recommended to use Apache HttpClient, as HttpClient handles a lot of things that you'd otherwise have to do yourself with the native Java classes.
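A minimal sketch of the same GET with HttpClient, assuming the 4.x API (org.apache.httpcomponents:httpclient) rather than the older 3.x GetMethod shown earlier:
import org.apache.http.client.methods.CloseableHttpResponse;
import org.apache.http.client.methods.HttpGet;
import org.apache.http.impl.client.CloseableHttpClient;
import org.apache.http.impl.client.HttpClients;
import org.apache.http.util.EntityUtils;

public class HttpClientExample {
    public static void main(String[] args) throws Exception {
        // try-with-resources closes both the client and the response for us.
        try (CloseableHttpClient client = HttpClients.createDefault();
             CloseableHttpResponse response = client.execute(
                     new HttpGet("http://www.example.com"))) {
            System.out.println(response.getStatusLine());
            System.out.println(EntityUtils.toString(response.getEntity()));
        }
    }
}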