I know there are several questions on this topic, but I didn't find an answer in any of them.
I'm trying to open a connection to my local server, but I keep getting connection refused.
I have the server running, and I tested the connection with the browser and with a Google app called Postman, and it works.
It's failing when opening the connection, as if there were nothing to connect to. Or maybe something is blocking the connection? I tested with the firewall and antivirus down, with no luck.
Testing the URL in Postman returns a User as it should.
If I replace the URL with "http://www.google.com", it works fine.
Here is my code:
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.MalformedURLException;
import java.net.URL;

/**
 *
 * @author Gabriel
 */
public class HttpConnection {

    public HttpConnection() {
    }

    public void makeRequest() throws MalformedURLException, IOException {
        String url = "http://localhost:8000/users/1";
        URL obj = new URL(url);
        HttpURLConnection con = (HttpURLConnection) obj.openConnection();

        // optional, default is GET
        con.setRequestMethod("GET");

        // add request headers
        con.setRequestProperty("User-Agent", "Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/37.0.2062.120 Safari/537.36");
        con.setRequestProperty("Accept", "text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,*/*;q=0.8");
        con.setRequestProperty("Accept-Encoding", "gzip,deflate,sdch");
        con.setRequestProperty("Accept-Language", "en-US,en;q=0.8,es;q=0.6");
        con.setRequestProperty("Connection", "keep-alive");
        con.setRequestProperty("Host", "localhost:8000");

        int responseCode = con.getResponseCode();
        System.out.println("\nSending 'GET' request to URL : " + url);
        System.out.println("Response Code : " + responseCode);

        BufferedReader in = new BufferedReader(new InputStreamReader(con.getInputStream()));
        String inputLine;
        StringBuffer response = new StringBuffer();
        while ((inputLine = in.readLine()) != null) {
            response.append(inputLine);
        }
        in.close();

        // print result
        System.out.println(response.toString());
    }
}
I faced exactly the same problem. Use this instead of localhost:
http://[::1]:8000/index.php
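Adapted to the code in the question, that means pointing the url string at the IPv6 loopback address, for example:
String url = "http://[::1]:8000/users/1";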
I have similar code that is working, but my request header is a lot simpler. Basically just:
con.setRequestProperty("User-Agent", "Mozilla/5.0");
If simplifying the header does not help, I would capture the traffic from your browser with something like Fiddler and then make the request look exactly like that.
I will make a wild guess at what the problem could be: it is possibly an IPv4/IPv6 problem.
If so, here are two possible solutions:
If the server is only listening on an IPv6 address, change it to listen on IPv4.
If the server is listening on IPv4, force Java to use IPv4 with
java.net.preferIPv4Stack=true
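A quick sketch of both ways to set that flag; note it has to take effect before any networking classes are used (YourMainClass is a placeholder, and the HttpConnection call just reuses the class from the question):

// Option 1: pass the flag when launching the JVM
//   java -Djava.net.preferIPv4Stack=true YourMainClass

// Option 2: set it programmatically before the first network call
public class PreferIPv4Example {
    public static void main(String[] args) throws Exception {
        System.setProperty("java.net.preferIPv4Stack", "true");
        new HttpConnection().makeRequest(); // the HttpConnection class from the question
    }
}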
You can try implementing CORS on the API you are trying to connect to, by setting the access-control-allow-origin: * header on the response.
The code is good and works great, so the problem must be on the transport or network side. What I mean is that the request may not be reaching the right server. If you use 127.0.0.1 instead of localhost, I think you won't get a problem. So my guess is that you have a problem in /etc/hosts or C:\Windows\System32\drivers\etc\hosts.
I advise you to try a simple test: ping the hostname and check in the output that the IP address is correct.
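A quick way to run the same check from Java itself is a sketch like this; it just prints every address that localhost resolves to, so you can see whether it maps to 127.0.0.1, ::1, or something unexpected:

import java.net.InetAddress;

public class ResolveLocalhost {
    public static void main(String[] args) throws Exception {
        // Print every address the name resolves to via the hosts file / DNS
        for (InetAddress addr : InetAddress.getAllByName("localhost")) {
            System.out.println(addr.getHostAddress());
        }
    }
}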
Well, put http://localhost:8000/users/1 in your web browser and see what you get. If you get a simple Connection Refused error there too, it's not you, it's the site. Also, URLs normally identify websites using protocol identifiers (http://, https://) and top-level domains (.com, .edu, .gov); that can be another reason why you get an error.
You mentioned that you were opening a connection to your "local server".
I am assuming that you are doing this on the same computer that you're hosting the server on?
Try to open the connection to your local server using a different computer.
This is my code:
URL url = new URL("http://superchillin.com/login2.php");
HttpURLConnection urlConnection = (HttpURLConnection)url.openConnection();
urlConnection.setUseCaches(false);
urlConnection.setRequestMethod("POST");
String data = "email="+URLEncoder.encode(name, "UTF-8")+"&password="+URLEncoder.encode(pass, "UTF-8");
urlConnection.setRequestProperty("Accept", "text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,*/*;q=0.8");
urlConnection.setRequestProperty("Accept-Encoding", "gzip,deflate");
urlConnection.setRequestProperty("Accept-Language", "en-US,en;q=0.8,lt;q=0.6");
urlConnection.setRequestProperty("Cache-Control", "max-age=0");
urlConnection.setRequestProperty("Connection", "keep-alive");
urlConnection.setRequestProperty("Content-Length", Integer.toString(data.getBytes().length));
urlConnection.setRequestProperty("Content-Type", "application/x-www-form-urlencoded");
urlConnection.addRequestProperty("Cookie", "place=1");
urlConnection.addRequestProperty("Cookie", "lvca_unique_user=1");
urlConnection.setRequestProperty("Host", "superchillin.com");
urlConnection.setRequestProperty("Origin", "http://superchillin.com");
urlConnection.setRequestProperty("Referer", "http://superchillin.com/login.php");
urlConnection.setRequestProperty("User-Agent", "Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/37.0.2062.124 Safari/537.36");
urlConnection.setDoOutput(true);
urlConnection.setDoInput(true);
urlConnection.setInstanceFollowRedirects(true);
DataOutputStream wr = new DataOutputStream(urlConnection.getOutputStream());
wr.writeBytes(data);
wr.flush();
wr.close();
After that code I only read the response. It redirects me to "login.php" and is trying to set cookie "place=1"...
Connecting via browser works great. The reason for so many headers is that I thought they might be the problem, so I copied all the headers I see when using a browser.
The response code is 200.
I also noticed that if the password or email is incorrect, there's a message saying so in the HTML that I retrieve.
When I use a browser I get redirected to index.php and an "auth" cookie is set. That's what I'm expecting from my program as well. Currently I get redirected back to "login.php".
There is no universal answer to this question, I'm afraid. What you're asking is "why does the remote server not return an auth cookie when I send this exact request?" And that depends entirely on what the server's documentation says about those requests, whether it has any bugs in its implementation, etc.
If you don't have access to the server's own source and logs, then you'll likely have to get by with experimentation. Use something like Firebug or Chrome's Developer Tools to capture the exact requests sent by the browser when the login works successfully. Since those text strings are the only thing the remote server sees, if you replicate them exactly with your Java program you will (or should) get exactly the same responses.
If you think you're sending the same requests from Java and find that you're still not getting the expected responses, there must be some difference. Try recording the network traffic with something like Wireshark to see exactly what your app is sending, and then address the differences.
And if you get to the point where, for example, a redirect isn't being followed and you're not sure how to do that with a URLConnection, then that's a good, concrete question to ask.
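For that specific redirect-plus-cookie situation, a rough, untested sketch of doing it by hand with HttpURLConnection might look like this (the URL and form field names come from the question; the credentials are placeholders):

import java.io.DataOutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.util.List;

public class ManualRedirect {
    public static void main(String[] args) throws Exception {
        // First request: POST the login form, but handle the redirect ourselves
        HttpURLConnection login = (HttpURLConnection) new URL("http://superchillin.com/login2.php").openConnection();
        login.setRequestMethod("POST");
        login.setInstanceFollowRedirects(false); // we want to see the 3xx and Set-Cookie headers ourselves
        login.setDoOutput(true);
        try (DataOutputStream out = new DataOutputStream(login.getOutputStream())) {
            out.writeBytes("email=user%40example.com&password=secret"); // placeholder credentials
        }

        int code = login.getResponseCode();                  // e.g. 302 on a successful login
        String location = login.getHeaderField("Location");  // where the server wants to send us
        List<String> cookies = login.getHeaderFields().get("Set-Cookie");
        System.out.println(code + " -> " + location + ", cookies = " + cookies);

        // Second request: follow the redirect manually and send the cookies back
        if (location != null && cookies != null) {
            URL next = new URL(new URL("http://superchillin.com/"), location);
            HttpURLConnection follow = (HttpURLConnection) next.openConnection();
            for (String cookie : cookies) {
                // only the "name=value" part before the first ';' is sent back
                follow.addRequestProperty("Cookie", cookie.split(";", 2)[0]);
            }
            System.out.println("Follow-up status: " + follow.getResponseCode());
        }
    }
}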
I am trying out a simple program for reading the HTML content from a given URL. The URL I am trying in this case doesn't require any cookie/username/password, but I am still getting a java.io.IOException: Server returned HTTP response code: 403 error. Can anyone tell me what I am doing wrong here? (I know there are similar questions on SO, but they didn't help):
import java.net.*;
import java.io.*;
import java.net.MalformedURLException;
import java.io.IOException;

public class urlcont {

    public static void main(String[] args) {
        try {
            URL u = new URL("http://www.amnesty.org/");
            URLConnection uc = u.openConnection();
            uc.addRequestProperty("User-Agent", "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.0)");
            uc.connect();
            InputStream in = uc.getInputStream();
            int b;
            File f = new File("C:\\Users\\kausta\\Desktop\\urlcont.txt");
            f.createNewFile();
            OutputStream s = new FileOutputStream(f);
            while ((b = in.read()) != -1) {
                s.write(b);
            }
            // close the streams so the output file is flushed to disk
            s.close();
            in.close();
        }
        catch (MalformedURLException e) {System.err.println(e);}
        catch (IOException e) {System.err.println(e);}
    }
}
If you can fetch the URL in a browser, but not via Java, that indicates, to me, that they are blocking programmatic access to the page via user-agent filtering. Try setting the user-agent on your connection so that your code appears, to the webserver, to be a web-browser.
See this thread for help on that: What is the proper way of setting headers in a URLConnection?
There is a permission problem:
A web server may return a 403 Forbidden HTTP status code in response to a request from a client for a web page or resource, to indicate that the server refuses to allow the requested action.
You are not doing anything "wrong"; the server you are trying to access is blocking your request because you are not allowed to access the file.
HTTP error 403 means Forbidden: the remote server is blocking the request.
Check whether you need to provide authentication to access the document you want, and if so, supply it with the request ;)
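If it does turn out to require HTTP Basic authentication, a minimal sketch of supplying credentials with URLConnection might look like this (java.util.Base64 needs Java 8+; the credentials, and the idea that the site uses Basic auth at all, are assumptions rather than anything from the question):

import java.net.URL;
import java.net.URLConnection;
import java.util.Base64;

public class BasicAuthExample {
    public static void main(String[] args) throws Exception {
        URLConnection uc = new URL("http://www.amnesty.org/").openConnection();

        // Build the Authorization header by hand: "Basic " + base64("user:password")
        String credentials = "user:password"; // placeholder credentials
        String encoded = Base64.getEncoder().encodeToString(credentials.getBytes("UTF-8"));
        uc.setRequestProperty("Authorization", "Basic " + encoded);

        uc.connect();
        System.out.println("Content type: " + uc.getContentType());
    }
}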
I use simple code to get the HTML for http://www.ip-adress.com, but it returns HTTP error code 403.
If I try another website, like google.com, in the program, it works. I can also open www.ip-adress.com in the browser, so why can't I access it from my Java program?
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.net.MalformedURLException;
import java.net.URL;
import java.net.URLConnection;

public class urlconnection
{
    public static void main(String[] args)
    {
        StringBuffer document = new StringBuffer();
        try
        {
            URL url = new URL("http://www.ip-adress.com");
            URLConnection conn = url.openConnection();
            BufferedReader reader = new BufferedReader(new InputStreamReader(conn.getInputStream()));
            String line = null;
            while ((line = reader.readLine()) != null)
                document.append(line + " ");
            reader.close();
        }
        catch (MalformedURLException e)
        {
            e.printStackTrace();
        }
        catch (IOException e)
        {
            e.printStackTrace();
        }
        System.out.println(document.toString());
    }
}
java.io.IOException: Server returned HTTP response code: 403 for URL: http://www.ip-adress.com/
at sun.net.www.protocol.http.HttpURLConnection.getInputStream(Unknown Source)
at urlconnection.main(urlconnection.java:14)
This is the line you need:
conn.setRequestProperty("User-Agent", "Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10.4; en-US; rv:1.9.2.2) Gecko/20100316 Firefox/3.6.2");
Refer to this.
The web server can detect that it is not a regular browser making the request, so it rejects it. There are ways to fake that and trick the server into thinking you are a browser.
I suppose the site checks the User-Agent header and blocks anything that looks like "a robot". You need to mimic a normal browser. Check this solution: Setting user agent of a java URLConnection, or try using Commons HttpClient and set the user agent.
I don't believe that this is fundamentally a Java problem. You're doing the right thing to make an HTTP connection, and the server is doing "the right thing" from its perspective by responding to your request with a 403 response.
Let's be clear about this - the response you're getting is due to whatever logic is being employed by the target webserver.
So if you were to ask "how can I modify my request so that http://www.ip-adress.com returns a 200 response", then people may be able to come up with workarounds that keep that server happy. But this is a host-specific process; your Java code is arguably correct, though it should have better error handling, because you can always get non-2xx responses.
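On that error-handling point, a small sketch of checking the status code and falling back to the error stream instead of letting getInputStream() throw (the URL is the one from the question):

import java.io.BufferedReader;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;

public class StatusAwareFetch {
    public static void main(String[] args) throws Exception {
        HttpURLConnection conn = (HttpURLConnection) new URL("http://www.ip-adress.com").openConnection();
        int status = conn.getResponseCode();

        // For non-2xx responses, getInputStream() throws; the body (if any) is on the error stream
        InputStream body = (status >= 200 && status < 300)
                ? conn.getInputStream()
                : conn.getErrorStream();

        System.out.println("HTTP status: " + status);
        if (body != null) {
            try (BufferedReader reader = new BufferedReader(new InputStreamReader(body))) {
                String line;
                while ((line = reader.readLine()) != null) {
                    System.out.println(line);
                }
            }
        }
    }
}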
Try changing the connection's User-Agent to something a browser would send. Most of the time I use Mozilla/6.0 (Windows NT 6.2; WOW64; rv:16.0.1) Gecko/20121011 Firefox/16.0.1.
I am not able to open a URLConnection to a particular web resource. I am getting
"java.net.ConnectException: Connection timed out". Is it because that domain is blocking direct URL connections? If so, how are they blocking it? Below is the code snippet I wrote.
Code snippet:
import java.io.*;
import java.net.*;

public class TestFileRead {
    public static void main(String args[]) {
        try {
            String serviceUrl = "http://xyz.com/examples.zip";
            HttpURLConnection serviceConnection = (HttpURLConnection) new URL(serviceUrl).openConnection();
            System.out.println(serviceConnection);
            serviceConnection.addRequestProperty("User-Agent", "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1; SV1; .NET CLR 2.0.50727)");
            DataInputStream din = new DataInputStream(serviceConnection.getInputStream());
            FileOutputStream fout = new FileOutputStream("downloaded");
            DataOutputStream dout = new DataOutputStream(fout);
            int bytes;
            while (din.available() > 0) {
                bytes = din.readByte();
                dout.write(bytes);
            }
            // close the streams so the downloaded file is flushed to disk
            dout.close();
            din.close();
        } catch (Exception ex) {
            ex.printStackTrace();
        }
    }
}
You are probably using the proxy set up in your browser to access the Yahoo home page, which explains why it works in your browser and not in your code. You need a proxy configuration for your Java application.
The simplest way would be to set the system properties http.proxyHost and http.proxyPort when running the code (in Eclipse, or when running from the command line, just add -Dhttp.proxyHost=your.host.com -Dhttp.proxyPort=80) and you should be good to go. Pick up the proxy settings from your browser configuration/settings.
EDIT: This link does a decent job of explaining the possible solutions when dealing with proxies in Java.
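For completeness, a small sketch of doing the same thing programmatically (proxy.example.com and port 8080 are placeholders for your actual proxy):

import java.net.HttpURLConnection;
import java.net.InetSocketAddress;
import java.net.Proxy;
import java.net.URL;

public class ProxyExample {
    public static void main(String[] args) throws Exception {
        // Option 1: JVM-wide proxy settings, equivalent to the -D flags above
        System.setProperty("http.proxyHost", "proxy.example.com"); // placeholder host
        System.setProperty("http.proxyPort", "8080");              // placeholder port

        // Option 2: a per-connection proxy via java.net.Proxy
        Proxy proxy = new Proxy(Proxy.Type.HTTP, new InetSocketAddress("proxy.example.com", 8080));
        HttpURLConnection conn = (HttpURLConnection)
                new URL("http://xyz.com/examples.zip").openConnection(proxy);
        System.out.println("Status: " + conn.getResponseCode());
    }
}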
Try this, it works fine for me, returning the index page.
String serviceUrl = "http://yahoo.com";
HttpURLConnection serviceConnection = (HttpURLConnection) new URL(serviceUrl).openConnection();
serviceConnection.addRequestProperty("User-Agent", "blah"); //some sites deny access to some pages when User-Agent is Java
BufferedReader in = new BufferedReader(new InputStreamReader(serviceConnection.getInputStream()));
I am trying to make a request to a webpage that requires cookies. I'm using HttpURLConnection, but the response always comes back saying
<div class="body"><p>Your browser's cookie functionality is turned off. Please turn it on.
How can I make the request such that the queried server thinks I have cookies turned on? My code goes something like this.
private String readPage(String page) throws MalformedURLException {
    StringBuilder sb = new StringBuilder(); // accumulates the page contents
    try {
        URL url = new URL(page);
        HttpURLConnection uc = (HttpURLConnection) url.openConnection();
        uc.connect();
        InputStream in = uc.getInputStream();
        int v;
        while ((v = in.read()) != -1) {
            sb.append((char) v);
        }
        in.close();
        uc.disconnect();
    } catch (IOException e) {
        e.printStackTrace();
    }
    return sb.toString();
}
You need to add a CookieHandler to the system for it to handle cookies. Before Java 6 there is no CookieHandler implementation in the JRE, so you have to write your own. If you are on Java 6, you can do this:
CookieHandler.setDefault(new CookieManager());
URLConnection's cookie handling is really weak. It barely works. It doesn't handle all the cookie rules correctly. You should use Apache HttpClient if you are dealing with sensitive cookies like authentication.
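To make that concrete, a small sketch of the Java 6+ route (the URL is a placeholder; the default CookieManager policy accepts cookies from the origin server, which is usually what you want here):

import java.net.CookieHandler;
import java.net.CookieManager;
import java.net.HttpURLConnection;
import java.net.URL;

public class CookieExample {
    public static void main(String[] args) throws Exception {
        // Install a cookie manager once, before making any requests;
        // every URLConnection in this JVM will then store and resend cookies.
        CookieHandler.setDefault(new CookieManager());

        URL url = new URL("http://example.com/page-that-needs-cookies"); // placeholder URL
        HttpURLConnection first = (HttpURLConnection) url.openConnection();
        first.getInputStream().close(); // first request receives the Set-Cookie headers

        HttpURLConnection second = (HttpURLConnection) url.openConnection();
        // Cookies captured from the first response are sent automatically on this request.
        System.out.println("Second request status: " + second.getResponseCode());
    }
}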
I think the server can't determine on the first request whether the client supports cookies, so it probably sends a redirect. Try disabling redirects:
uc.setInstanceFollowRedirects(false);
Then you will be able to get the cookies from the response and use them (if you need to) on the next request.
uc.getHeaderFields()
// read the Set-Cookie header(s) here
URLConnection conn = url.openConnection();
conn.setRequestProperty("User-Agent", "Mozilla/5.0 (Windows; U; Windows NT 6.0; pl; rv:1.9.1.2) Gecko/20090729 Firefox/3.5.2");
conn.addRequestProperty("Referer", "http://xxxx");
conn.addRequestProperty("Cookie", "...");
If you're trying to scrape large volumes of data after a login, you may even be better off with a scripted web scraper like WebHarvest (http://web-harvest.sourceforge.net/). I've used it to great success in some of my own projects.